# Dataset Card for Evaluation run of abhinand/malayalam-llama-7b-instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhinand/malayalam-llama-7b-instruct-v0.1](https://huggingface.co/abhinand/malayalam-llama-7b-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the details of a single task (here Winogrande, 5-shot);
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1",
	"harness_winogrande_5",
	split="train")
```
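To load the aggregated results, or a specific run rather than the latest one, the same pattern applies. A minimal sketch: the "results" configuration name comes from this card, while the timestamped split name is illustrative, so list the available splits first if unsure.
```python
from datasets import get_dataset_split_names, load_dataset

REPO = "open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1"

# Aggregated metrics across all tasks (used by the leaderboard display).
results = load_dataset(REPO, "results", split="train")

# A specific run, selected by its timestamped split; print the split names
# to see which run timestamps exist for a given configuration.
print(get_dataset_split_names(REPO, "harness_winogrande_5"))
run = load_dataset(REPO, "harness_winogrande_5",
                   split="2024_01_23T18_17_56.469419")
```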
## Latest results
These are the [latest results from run 2024-01-23T18:17:56.469419](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1/blob/main/results_2024-01-23T18-17-56.469419.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24108004755305937,
"acc_stderr": 0.029966410902590897,
"acc_norm": 0.24113007863184896,
"acc_norm_stderr": 0.030748896883397367,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.47114186838793193,
"mc2_stderr": 0.015324555786314642
},
"harness|arc:challenge|25": {
"acc": 0.3575085324232082,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.3720136518771331,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.5427205735909182,
"acc_stderr": 0.004971534874389941,
"acc_norm": 0.67805218084047,
"acc_norm_stderr": 0.0046626822330937704
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.47114186838793193,
"mc2_stderr": 0.015324555786314642
},
"harness|winogrande|5": {
"acc": 0.6290449881610103,
"acc_stderr": 0.01357639990223157
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
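For a quick sanity check on these numbers, here is a minimal sketch that averages the normalized accuracy over the MMLU (hendrycksTest) subtasks from the results JSON linked above. Assumptions: the file is fetched via the standard `resolve/main` raw-file path, and the per-task scores sit either at the top level or under a `"results"` key.
```python
import json
from urllib.request import urlopen

# Raw-file URL for the results JSON linked in this section (assumed path).
URL = ("https://huggingface.co/datasets/open-llm-leaderboard/"
       "details_abhinand__malayalam-llama-7b-instruct-v0.1/resolve/main/"
       "results_2024-01-23T18-17-56.469419.json")

scores = json.loads(urlopen(URL).read())
scores = scores.get("results", scores)  # unwrap if the scores are nested

# Mean acc_norm over the MMLU subtasks (keys prefixed "harness|hendrycksTest").
mmlu = [v["acc_norm"] for k, v in scores.items()
        if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu)/len(mmlu):.4f}")
```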
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-23T15:05:27+00:00 | {"pretty_name": "Evaluation run of abhinand/malayalam-llama-7b-instruct-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhinand/malayalam-llama-7b-instruct-v0.1](https://huggingface.co/abhinand/malayalam-llama-7b-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T18:17:56.469419](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1/blob/main/results_2024-01-23T18-17-56.469419.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24108004755305937,\n \"acc_stderr\": 0.029966410902590897,\n \"acc_norm\": 0.24113007863184896,\n \"acc_norm_stderr\": 0.030748896883397367,\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.47114186838793193,\n \"mc2_stderr\": 0.015324555786314642\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3575085324232082,\n \"acc_stderr\": 0.014005494275916573,\n \"acc_norm\": 0.3720136518771331,\n \"acc_norm_stderr\": 0.014124597881844461\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5427205735909182,\n \"acc_stderr\": 0.004971534874389941,\n \"acc_norm\": 0.67805218084047,\n \"acc_norm_stderr\": 0.0046626822330937704\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n 
\"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.47114186838793193,\n \"mc2_stderr\": 0.015324555786314642\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6290449881610103,\n \"acc_stderr\": 0.01357639990223157\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n 
\"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/abhinand/malayalam-llama-7b-instruct-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|arc:challenge|25_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|arc:challenge|25_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|gsm8k|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|gsm8k|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hellaswag|10_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hellaswag|10_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-03-09.994795.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T15-03-09.994795.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": 
"2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-03-09.994795.parquet"]}, 
{"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["**/details_harness|winogrande|5_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": ["**/details_harness|winogrande|5_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T18-17-56.469419.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T15_03_09.994795", "path": ["results_2024-01-23T15-03-09.994795.parquet"]}, {"split": "2024_01_23T18_17_56.469419", "path": 
["results_2024-01-23T18-17-56.469419.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T18-17-56.469419.parquet"]}]}]} | 2024-01-23T18:20:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhinand/malayalam-llama-7b-instruct-v0.1
Dataset automatically created during the evaluation run of model abhinand/malayalam-llama-7b-instruct-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
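A minimal sketch, assuming the repo id follows the leaderboard's `details_<org>__<model>` naming pattern used elsewhere in this card:

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 task configurations;
# the "train" split tracks the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1",
    "harness_winogrande_5",
    split="train",
)
```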
## Latest results
These are the latest results from run 2024-01-23T18:17:56.469419 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhinand/malayalam-llama-7b-instruct-v0.1\n\n\n\nDataset automatically created during the evaluation run of model abhinand/malayalam-llama-7b-instruct-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T18:17:56.469419(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhinand/malayalam-llama-7b-instruct-v0.1\n\n\n\nDataset automatically created during the evaluation run of model abhinand/malayalam-llama-7b-instruct-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T18:17:56.469419(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2f774755feb5d359ff0419c1bd042e08b2d2a6de |
# Dataset Card for Evaluation run of Cartinoe5930/DARE-Merging
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Cartinoe5930/DARE-Merging](https://huggingface.co/Cartinoe5930/DARE-Merging) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__DARE-Merging",
"harness_winogrande_5",
split="train")
```
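The aggregated scores reported below live in the separate "results" configuration; a minimal sketch of fetching them, assuming the "latest" split alias used by the other configurations in this dataset:

```python
from datasets import load_dataset

# The "results" configuration aggregates all task scores;
# "latest" resolves to the most recent timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_Cartinoe5930__DARE-Merging",
    "results",
    split="latest",
)
```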
## Latest results
These are the [latest results from run 2024-01-23T15:09:50.545148](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__DARE-Merging/blob/main/results_2024-01-23T15-09-50.545148.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2372712195481079,
"acc_stderr": 0.02996006519093449,
"acc_norm": 0.23747207216886124,
"acc_norm_stderr": 0.030741935495139164,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.48312740729573567,
"mc2_stderr": 0.01641721201782194
},
"harness|arc:challenge|25": {
"acc": 0.20051194539249148,
"acc_stderr": 0.011700318050499377,
"acc_norm": 0.2525597269624573,
"acc_norm_stderr": 0.012696728980207706
},
"harness|hellaswag|10": {
"acc": 0.2581159131647082,
"acc_stderr": 0.004367037632204529,
"acc_norm": 0.26110336586337385,
"acc_norm_stderr": 0.004383384784038464
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510863,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510863
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198823,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198823
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416542,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416542
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.02694748312149623,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.02694748312149623
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21693121693121692,
"acc_stderr": 0.021227082449445062,
"acc_norm": 0.21693121693121692,
"acc_norm_stderr": 0.021227082449445062
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.20967741935483872,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.20967741935483872,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860667,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860667
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148543,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148543
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20168067226890757,
"acc_stderr": 0.026064313406304527,
"acc_norm": 0.20168067226890757,
"acc_norm_stderr": 0.026064313406304527
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3284403669724771,
"acc_stderr": 0.02013590279729839,
"acc_norm": 0.3284403669724771,
"acc_norm_stderr": 0.02013590279729839
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822586,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822586
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22349936143039592,
"acc_stderr": 0.014897235229450708,
"acc_norm": 0.22349936143039592,
"acc_norm_stderr": 0.014897235229450708
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.02298959254312357,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.02298959254312357
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.02405102973991225,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.02405102973991225
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28594771241830064,
"acc_stderr": 0.018280485072954673,
"acc_norm": 0.28594771241830064,
"acc_norm_stderr": 0.018280485072954673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.48312740729573567,
"mc2_stderr": 0.01641721201782194
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612978
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225352
}
}
```
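Because each run is stored as a timestamped split, a specific run can be pinned instead of relying on "latest". A sketch using the run timestamp of these results (dashes and colons in the timestamp become underscores in the split name):

```python
from datasets import load_dataset

# Pin the exact run reported above rather than the moving "latest" alias.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Cartinoe5930__DARE-Merging",
    "harness_gsm8k_5",
    split="2024_01_23T15_09_50.545148",
)
```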
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Cartinoe5930__DARE-Merging | [
"region:us"
] | 2024-01-23T15:12:07+00:00 | {"pretty_name": "Evaluation run of Cartinoe5930/DARE-Merging", "dataset_summary": "Dataset automatically created during the evaluation run of model [Cartinoe5930/DARE-Merging](https://huggingface.co/Cartinoe5930/DARE-Merging) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cartinoe5930__DARE-Merging\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T15:09:50.545148](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__DARE-Merging/blob/main/results_2024-01-23T15-09-50.545148.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2372712195481079,\n \"acc_stderr\": 0.02996006519093449,\n \"acc_norm\": 0.23747207216886124,\n \"acc_norm_stderr\": 0.030741935495139164,\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.48312740729573567,\n \"mc2_stderr\": 0.01641721201782194\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20051194539249148,\n \"acc_stderr\": 0.011700318050499377,\n \"acc_norm\": 0.2525597269624573,\n \"acc_norm_stderr\": 0.012696728980207706\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2581159131647082,\n \"acc_stderr\": 0.004367037632204529,\n \"acc_norm\": 0.26110336586337385,\n \"acc_norm_stderr\": 0.004383384784038464\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510863,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510863\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 
0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.02694748312149623,\n \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.02694748312149623\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21693121693121692,\n \"acc_stderr\": 0.021227082449445062,\n \"acc_norm\": 0.21693121693121692,\n \"acc_norm_stderr\": 0.021227082449445062\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036847,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.20967741935483872,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.20967741935483872,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25757575757575757,\n \"acc_stderr\": 0.031156269519646836,\n \"acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.031156269519646836\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860667,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860667\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148543,\n \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148543\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.026064313406304527,\n \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.026064313406304527\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3284403669724771,\n \"acc_stderr\": 0.02013590279729839,\n \"acc_norm\": 0.3284403669724771,\n \"acc_norm_stderr\": 0.02013590279729839\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.3004484304932735,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822586,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822586\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22349936143039592,\n 
\"acc_stderr\": 0.014897235229450708,\n \"acc_norm\": 0.22349936143039592,\n \"acc_norm_stderr\": 0.014897235229450708\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.02405102973991225,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.02405102973991225\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460842,\n \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28594771241830064,\n \"acc_stderr\": 0.018280485072954673,\n \"acc_norm\": 0.28594771241830064,\n \"acc_norm_stderr\": 0.018280485072954673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.031414708025865885,\n \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.031414708025865885\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.48312740729573567,\n \"mc2_stderr\": 0.01641721201782194\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.014044390401612978\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225352\n }\n}\n```", 
"repo_url": "https://huggingface.co/Cartinoe5930/DARE-Merging", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|arc:challenge|25_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|gsm8k|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hellaswag|10_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-09-50.545148.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-09-50.545148.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-09-50.545148.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T15-09-50.545148.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-09-50.545148.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T15_09_50.545148", "path": ["**/details_harness|winogrande|5_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T15-09-50.545148.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T15_09_50.545148", "path": ["results_2024-01-23T15-09-50.545148.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T15-09-50.545148.parquet"]}]}]} | 2024-01-23T15:12:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Cartinoe5930/DARE-Merging
Dataset automatically created during the evaluation run of model Cartinoe5930/DARE-Merging on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
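For instance (a minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repo naming and the `harness_winogrande_5` config listed in this card's metadata):
```python
from datasets import load_dataset

# Hypothetical reconstruction: the repo name is assumed to follow the
# standard "open-llm-leaderboard/details_<org>__<model>" convention.
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__DARE-Merging",
                    "harness_winogrande_5",
                    split="train")
```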
## Latest results
These are the latest results from run 2024-01-23T15:09:50.545148 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Cartinoe5930/DARE-Merging\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/DARE-Merging on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T15:09:50.545148(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Cartinoe5930/DARE-Merging\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/DARE-Merging on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T15:09:50.545148(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
78ffabbe1c5ff019c49ff37bdadad15ea1625888 |
# Dataset Card for Evaluation run of abhinand/tamil-llama-7b-instruct-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhinand/tamil-llama-7b-instruct-v0.2](https://huggingface.co/abhinand/tamil-llama-7b-instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2",
"harness_winogrande_5",
split="train")
```
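To work with the aggregated scores instead of per-example details, the `results` configuration can be loaded the same way; a minimal sketch, based on the `results` config and `latest` split listed in this card's metadata:
```python
from datasets import load_dataset

# The "results" config stores one row per evaluation run with the
# aggregated metrics; the "latest" split points to the newest run.
results = load_dataset("open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2",
                       "results",
                       split="latest")
print(results[0])
```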
## Latest results
These are the [latest results from run 2024-01-23T18:30:45.482735](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2/blob/main/results_2024-01-23T18-30-45.482735.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.243075402543886,
"acc_stderr": 0.030069028919401566,
"acc_norm": 0.24181008544813296,
"acc_norm_stderr": 0.030751648835495787,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155055,
"mc2": 0.5003889364770407,
"mc2_stderr": 0.015377822106726793
},
"harness|arc:challenge|25": {
"acc": 0.3967576791808874,
"acc_stderr": 0.014296513020180646,
"acc_norm": 0.40187713310580203,
"acc_norm_stderr": 0.01432726861457827
},
"harness|hellaswag|10": {
"acc": 0.5311690898227445,
"acc_stderr": 0.00498007670739244,
"acc_norm": 0.6883091017725552,
"acc_norm_stderr": 0.004622376674166701
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155055,
"mc2": 0.5003889364770407,
"mc2_stderr": 0.015377822106726793
},
"harness|winogrande|5": {
"acc": 0.6677190213101816,
"acc_stderr": 0.013238316554236521
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179564
}
}
```
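Because the per-task scores above are plain JSON, summary numbers such as the MMLU (hendrycksTest) average are easy to recompute; a minimal sketch, assuming the JSON object above has been saved to a hypothetical local file `latest_results.json`:
```python
import json

# Hypothetical local copy of the JSON results block shown above.
with open("latest_results.json") as f:
    results = json.load(f)

# Average accuracy over the 57 hendrycksTest (MMLU) subtasks.
mmlu = {k: v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")}
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")

# Sanity check (assumption): acc_stderr appears consistent with the sample
# standard error of a proportion, sqrt(p * (1 - p) / (n - 1)); e.g. for
# abstract_algebra, (0.22 * 0.78 / 99) ** 0.5 ~= 0.04163, matching the
# value reported above for that subtask.
```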
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2 | [
"region:us"
] | 2024-01-23T15:22:53+00:00 | {"pretty_name": "Evaluation run of abhinand/tamil-llama-7b-instruct-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhinand/tamil-llama-7b-instruct-v0.2](https://huggingface.co/abhinand/tamil-llama-7b-instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T18:30:45.482735](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2/blob/main/results_2024-01-23T18-30-45.482735.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.243075402543886,\n \"acc_stderr\": 0.030069028919401566,\n \"acc_norm\": 0.24181008544813296,\n \"acc_norm_stderr\": 0.030751648835495787,\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155055,\n \"mc2\": 0.5003889364770407,\n \"mc2_stderr\": 0.015377822106726793\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3967576791808874,\n \"acc_stderr\": 0.014296513020180646,\n \"acc_norm\": 0.40187713310580203,\n \"acc_norm_stderr\": 0.01432726861457827\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5311690898227445,\n \"acc_stderr\": 0.00498007670739244,\n \"acc_norm\": 0.6883091017725552,\n \"acc_norm_stderr\": 0.004622376674166701\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 
0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155055,\n \"mc2\": 0.5003889364770407,\n \"mc2_stderr\": 0.015377822106726793\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6677190213101816,\n \"acc_stderr\": 0.013238316554236521\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.05534495830174375,\n \"acc_stderr\": 0.006298221796179564\n }\n}\n```", "repo_url": "https://huggingface.co/abhinand/tamil-llama-7b-instruct-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|arc:challenge|25_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|arc:challenge|25_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|gsm8k|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|gsm8k|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hellaswag|10_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hellaswag|10_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-20-33.725071.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T15-20-33.725071.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": 
"2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-20-33.725071.parquet"]}, 
{"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["**/details_harness|winogrande|5_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": ["**/details_harness|winogrande|5_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T18-30-45.482735.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T15_20_33.725071", "path": ["results_2024-01-23T15-20-33.725071.parquet"]}, {"split": "2024_01_23T18_30_45.482735", "path": 
["results_2024-01-23T18-30-45.482735.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T18-30-45.482735.parquet"]}]}]} | 2024-01-23T18:33:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhinand/tamil-llama-7b-instruct-v0.2
Dataset automatically created during the evaluation run of model abhinand/tamil-llama-7b-instruct-v0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
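As a minimal sketch of how to enumerate those splits with the standard `datasets` helper (the names shown in the comment are the ones declared in this card's configuration metadata):

```python
from datasets import get_dataset_split_names

# Each run appears as a timestamped split, alongside a "latest" alias.
splits = get_dataset_split_names(
    "open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2",
    config_name="harness_winogrande_5",
)
print(splits)
# e.g. ['2024_01_23T15_20_33.725071', '2024_01_23T18_30_45.482735', 'latest']
```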
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
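For the aggregated numbers themselves, a minimal sketch (the "results" config and its "latest" split are declared in this card's metadata):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" always points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2",
    "results",
    split="latest",
)
```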
To load the details from a run, you can for instance do the following:
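```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhinand__tamil-llama-7b-instruct-v0.2",
	"harness_winogrande_5",
	split="train")
```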
## Latest results
These are the latest results from run 2024-01-23T18:30:45.482735 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhinand/tamil-llama-7b-instruct-v0.2\n\n\n\nDataset automatically created during the evaluation run of model abhinand/tamil-llama-7b-instruct-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T18:30:45.482735(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhinand/tamil-llama-7b-instruct-v0.2\n\n\n\nDataset automatically created during the evaluation run of model abhinand/tamil-llama-7b-instruct-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T18:30:45.482735(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4fd380f4c8895fd16eaf0e090c5ab63af36f465a |
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest](https://huggingface.co/abhishekchohan/mistral-7B-forest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest",
"harness_winogrande_5",
split="train")
```
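Because each of the 63 evaluated tasks lives in its own configuration, it can help to enumerate them programmatically before picking one. A minimal sketch, assuming a recent `datasets` release that exports `get_dataset_config_names`:

```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest")
print(len(configs), configs[:5])

# The "latest" split always points at the most recent run for a given task.
details = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest",
	"harness_gsm8k_5",
	split="latest")
```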
## Latest results
These are the [latest results from run 2024-01-23T15:31:25.138971](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest/blob/main/results_2024-01-23T15-31-25.138971.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6312757004711416,
"acc_stderr": 0.03248074055562836,
"acc_norm": 0.6374796440943401,
"acc_norm_stderr": 0.03314363503746386,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5332124020960498,
"mc2_stderr": 0.01581823670998924
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257182,
"acc_norm": 0.6569965870307167,
"acc_norm_stderr": 0.013872423223718164
},
"harness|hellaswag|10": {
"acc": 0.6794463254331806,
"acc_stderr": 0.004657356402226446,
"acc_norm": 0.8625771758613822,
"acc_norm_stderr": 0.0034358953866922516
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.0242831405294673,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.0242831405294673
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295845,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295845
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611571,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611571
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638193,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206242,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5332124020960498,
"mc2_stderr": 0.01581823670998924
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462057
},
"harness|gsm8k|5": {
"acc": 0.32752084912812734,
"acc_stderr": 0.012927102210426467
}
}
```
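The same aggregated numbers are stored as a JSON file in the dataset repository (the file linked in the "Latest results" section above). A minimal sketch of fetching it directly, assuming `huggingface_hub` is installed:

```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest",
    filename="results_2024-01-23T15-31-25.138971.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"])  # overall accuracy across tasks (~0.631)
```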
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest | [
"region:us"
] | 2024-01-23T15:33:45+00:00 | {"pretty_name": "Evaluation run of abhishekchohan/mistral-7B-forest", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhishekchohan/mistral-7B-forest](https://huggingface.co/abhishekchohan/mistral-7B-forest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T15:31:25.138971](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest/blob/main/results_2024-01-23T15-31-25.138971.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6312757004711416,\n \"acc_stderr\": 0.03248074055562836,\n \"acc_norm\": 0.6374796440943401,\n \"acc_norm_stderr\": 0.03314363503746386,\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5332124020960498,\n \"mc2_stderr\": 0.01581823670998924\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257182,\n \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.013872423223718164\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6794463254331806,\n \"acc_stderr\": 0.004657356402226446,\n \"acc_norm\": 0.8625771758613822,\n \"acc_norm_stderr\": 0.0034358953866922516\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.03773809990686934\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.0242831405294673,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.0242831405294673\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295845,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295845\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.7969348659003831,\n \"acc_stderr\": 0.014385525076611571,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611571\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206242,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206242\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5332124020960498,\n \"mc2_stderr\": 0.01581823670998924\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462057\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32752084912812734,\n \"acc_stderr\": 
0.012927102210426467\n }\n}\n```", "repo_url": "https://huggingface.co/abhishekchohan/mistral-7B-forest", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|arc:challenge|25_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|gsm8k|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hellaswag|10_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-31-25.138971.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-31-25.138971.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-31-25.138971.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T15-31-25.138971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-31-25.138971.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T15_31_25.138971", "path": ["**/details_harness|winogrande|5_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T15-31-25.138971.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T15_31_25.138971", "path": ["results_2024-01-23T15-31-25.138971.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T15-31-25.138971.parquet"]}]}]} | 2024-01-23T15:34:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest
Dataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
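A minimal loading sketch, assuming the repository follows the leaderboard's usual `details_{org}__{model}` naming convention:

```python
from datasets import load_dataset

# Assumed repo id, following the details_{org}__{model} convention used by these cards
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest",
	"harness_winogrande_5",
	split="train")
```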
## Latest results
These are the latest results from run 2024-01-23T15:31:25.138971 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
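The aggregated numbers themselves live in the "results" configuration; a sketch for retrieving them (assuming the same repo id as above; the `latest` split always points at the most recent run):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" aliases the newest run
results = load_dataset("open-llm-leaderboard/details_abhishekchohan__mistral-7B-forest",
	"results",
	split="latest")
```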
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T15:31:25.138971(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhishekchohan/mistral-7B-forest\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/mistral-7B-forest on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T15:31:25.138971(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7530bd3c9e86a675e170005ed1837ddc9bea5b52 | # Qianyan Low-Resource NMT Dataset
"千言数据集:低资源语言翻译" ,旨在帮助研究人员和开发者解决低资源语言翻译的问题。该数据集包含了中文和俄文的5万条双语平行语料,以及中文和泰文、中文和越南文各10万条目标端单语语料。
For Thai and Vietnamese, Google Translate was used for back-translation to generate the corresponding Chinese data.
source=1 denotes translation from Chinese into the other language, and source=0 denotes translation from the other language into Chinese, distinguishing the translation direction of the test set.
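A minimal loading sketch (per this card's metadata, the `zh-ru`, `zh-th`, and `zh-vi` configurations each expose `train`, `valid`, and `test` splits; the filter assumes the `source` column is present as described above):

```python
from datasets import load_dataset

# Chinese-Russian pair; swap in "zh-th" or "zh-vi" for the other configurations
zhru = load_dataset("miugod/qianyan_nmt", "zh-ru")

# Keep only zh-to-ru test examples; source == 0 would give the reverse direction
zh_to_ru_test = zhru["test"].filter(lambda ex: ex["source"] == 1)
```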
See:
https://aistudio.baidu.com/competition/detail/84/0/introduction | miugod/qianyan_nmt | [
"task_categories:translation",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:zh",
"language:ru",
"language:th",
"language:vi",
"region:us"
] | 2024-01-23T16:03:49+00:00 | {"language": ["zh", "ru", "th", "vi"], "size_categories": ["100K<n<1M"], "task_categories": ["translation", "text-generation"], "pretty_name": "qianyan_nmt", "configs": [{"config_name": "zh-ru", "data_files": [{"split": "train", "path": "data/zhru/train.json"}, {"split": "valid", "path": "data/zhru/valid.json"}, {"split": "test", "path": "data/zhru/test.json"}]}, {"config_name": "zh-th", "data_files": [{"split": "train", "path": "data/zhth/train.json"}, {"split": "valid", "path": "data/zhth/valid.json"}, {"split": "test", "path": "data/zhth/test.json"}]}, {"config_name": "zh-vi", "data_files": [{"split": "train", "path": "data/zhvi/train.json"}, {"split": "valid", "path": "data/zhvi/valid.json"}, {"split": "test", "path": "data/zhvi/test.json"}]}]} | 2024-01-23T16:18:33+00:00 | [] | [
"zh",
"ru",
"th",
"vi"
] | TAGS
#task_categories-translation #task_categories-text-generation #size_categories-100K<n<1M #language-Chinese #language-Russian #language-Thai #language-Vietnamese #region-us
| # Qianyan Low-Resource NMT Dataset
"千言数据集:低资源语言翻译" ,旨在帮助研究人员和开发者解决低资源语言翻译的问题。该数据集包含了中文和俄文的5万条双语平行语料,以及中文和泰文、中文和越南文各10万条目标端单语语料。
For Thai and Vietnamese, Google Translate was used for back-translation to generate the corresponding Chinese data.
source=1 denotes translation from Chinese into the other language, and source=0 denotes translation from the other language into Chinese, distinguishing the translation direction of the test set.
See:
URL | [
"# Qianyan Low-Resource NMT Dataset\n\n\"千言数据集:低资源语言翻译\" ,旨在帮助研究人员和开发者解决低资源语言翻译的问题。该数据集包含了中文和俄文的5万条双语平行语料,以及中文和泰文、中文和越南文各10万条目标端单语语料。\n对于泰文和越南文,使用谷歌翻译进行回译,从而生成对应的中文数据。\nsource=1表示中文到其他语言的翻译,source=0表示其他语言到中文的翻译,以便区分测试集的语言方向。\n详见:\nURL"
] | [
"TAGS\n#task_categories-translation #task_categories-text-generation #size_categories-100K<n<1M #language-Chinese #language-Russian #language-Thai #language-Vietnamese #region-us \n",
"# Qianyan Low-Resource NMT Dataset\n\n\"千言数据集:低资源语言翻译\" ,旨在帮助研究人员和开发者解决低资源语言翻译的问题。该数据集包含了中文和俄文的5万条双语平行语料,以及中文和泰文、中文和越南文各10万条目标端单语语料。\n对于泰文和越南文,使用谷歌翻译进行回译,从而生成对应的中文数据。\nsource=1表示中文到其他语言的翻译,source=0表示其他语言到中文的翻译,以便区分测试集的语言方向。\n详见:\nURL"
] |
526a8012d32cd24d2b43130f13c6789a8635fe1b |
# Dataset Card for Evaluation run of Cartinoe5930/MoE-Merging
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Cartinoe5930/MoE-Merging](https://huggingface.co/Cartinoe5930/MoE-Merging) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__MoE-Merging",
"harness_winogrande_5",
split="train")
```
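The timestamped splits described above can also be enumerated directly; a small sketch with `get_dataset_split_names` (the printed list is illustrative):

```python
from datasets import get_dataset_split_names

# Splits are named after run timestamps, with "latest" aliasing the newest run
print(get_dataset_split_names("open-llm-leaderboard/details_Cartinoe5930__MoE-Merging",
                              "harness_winogrande_5"))
# e.g. ['2024_01_23T16_02_03.400569', 'latest']
```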
## Latest results
These are the [latest results from run 2024-01-23T16:02:03.400569](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__MoE-Merging/blob/main/results_2024-01-23T16-02-03.400569.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6151505712161501,
"acc_stderr": 0.03298337172536938,
"acc_norm": 0.6177051984375612,
"acc_norm_stderr": 0.03364497168310484,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.5783132294676995,
"mc2_stderr": 0.015725405307929003
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414047,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145677
},
"harness|hellaswag|10": {
"acc": 0.6493726349332802,
"acc_stderr": 0.004761912511707511,
"acc_norm": 0.8458474407488548,
"acc_norm_stderr": 0.0036035695286784127
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.026377567028645858,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.026377567028645858
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.031722334260021565,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.031722334260021565
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092365,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611573,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879706,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959402,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.02563082497562135,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.02563082497562135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787684,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.5783132294676995,
"mc2_stderr": 0.015725405307929003
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.0117056975652052
},
"harness|gsm8k|5": {
"acc": 0.5420773313115997,
"acc_stderr": 0.013723629649844084
}
}
```
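As a quick sanity check, the MMLU average can be recomputed from the per-task entries above; a minimal sketch, with `results.json` standing in for a hypothetical local copy of the dictionary shown:

```python
import json

# Hypothetical local copy of the results dictionary printed above
with open("results.json") as f:
    results = json.load(f)

# Mean acc_norm across the 57 hendrycksTest (MMLU) subtasks
scores = [v["acc_norm"] for k, v in results.items()
          if k.startswith("harness|hendrycksTest")]
print(f"MMLU acc_norm over {len(scores)} tasks: {sum(scores) / len(scores):.4f}")
```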
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Cartinoe5930__MoE-Merging | [
"region:us"
] | 2024-01-23T16:04:17+00:00 | {"pretty_name": "Evaluation run of Cartinoe5930/MoE-Merging", "dataset_summary": "Dataset automatically created during the evaluation run of model [Cartinoe5930/MoE-Merging](https://huggingface.co/Cartinoe5930/MoE-Merging) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cartinoe5930__MoE-Merging\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T16:02:03.400569](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__MoE-Merging/blob/main/results_2024-01-23T16-02-03.400569.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6151505712161501,\n \"acc_stderr\": 0.03298337172536938,\n \"acc_norm\": 0.6177051984375612,\n \"acc_norm_stderr\": 0.03364497168310484,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.5783132294676995,\n \"mc2_stderr\": 0.015725405307929003\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414047,\n \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145677\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6493726349332802,\n \"acc_stderr\": 0.004761912511707511,\n \"acc_norm\": 0.8458474407488548,\n \"acc_norm_stderr\": 0.0036035695286784127\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n \"acc_stderr\": 0.026377567028645858,\n \"acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.026377567028645858\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934833,\n \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934833\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.031722334260021565,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.031722334260021565\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n \"acc_stderr\": 
0.014385525076611573,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611573\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879706,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879706\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n \"acc_stderr\": 0.012689708167787684,\n \"acc_norm\": 0.4439374185136897,\n \"acc_norm_stderr\": 0.012689708167787684\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.5783132294676995,\n \"mc2_stderr\": 0.015725405307929003\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.0117056975652052\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5420773313115997,\n \"acc_stderr\": 0.013723629649844084\n }\n}\n```", "repo_url": 
"https://huggingface.co/Cartinoe5930/MoE-Merging", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|arc:challenge|25_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|gsm8k|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hellaswag|10_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-02-03.400569.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-02-03.400569.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-02-03.400569.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T16-02-03.400569.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-02-03.400569.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T16_02_03.400569", "path": ["**/details_harness|winogrande|5_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T16-02-03.400569.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T16_02_03.400569", "path": ["results_2024-01-23T16-02-03.400569.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T16-02-03.400569.parquet"]}]}]} | 2024-01-23T16:04:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Cartinoe5930/MoE-Merging
Dataset automatically created during the evaluation run of model Cartinoe5930/MoE-Merging on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
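The snippet referenced here was stripped from this rendering; below is a minimal sketch following the pattern used elsewhere in this document, assuming the repo id follows the leaderboard's `details_<org>__<model>` naming:

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's "details_<org>__<model>" naming
# convention; any of the 63 config names (e.g. "harness_winogrande_5") works.
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__MoE-Merging",
	"harness_winogrande_5",
	split="train")
```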
## Latest results
These are the latest results from run 2024-01-23T16:02:03.400569 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Cartinoe5930/MoE-Merging\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/MoE-Merging on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T16:02:03.400569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Cartinoe5930/MoE-Merging\n\n\n\nDataset automatically created during the evaluation run of model Cartinoe5930/MoE-Merging on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T16:02:03.400569(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d615b780ee90d6c16831f579f17a2f73e02f6907 |
# Dataset Card for Evaluation run of AA051612/A0123
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051612/A0123](https://huggingface.co/AA051612/A0123) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051612__A0123",
"harness_winogrande_5",
split="train")
```
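Each configuration listed in this card's metadata also exposes a "latest" split alongside the timestamped one, and the aggregated metrics live in the "results" configuration; a short sketch using those names (taken from the metadata above):

```python
from datasets import load_dataset

# The "results" config holds the aggregated scores for each run;
# the "latest" split always tracks the newest timestamped evaluation.
results = load_dataset("open-llm-leaderboard/details_AA051612__A0123",
	"results",
	split="latest")
print(results[0])
```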
## Latest results
These are the [latest results from run 2024-01-23T16:04:24.583795](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051612__A0123/blob/main/results_2024-01-23T16-04-24.583795.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7779570468818447,
"acc_stderr": 0.027463665563981252,
"acc_norm": 0.7837159328019954,
"acc_norm_stderr": 0.027955874516834893,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5842179226306571,
"mc2_stderr": 0.015288978796043103
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600938,
"acc_norm": 0.6766211604095563,
"acc_norm_stderr": 0.013669421630012125
},
"harness|hellaswag|10": {
"acc": 0.6557458673571002,
"acc_stderr": 0.004741534106470289,
"acc_norm": 0.8487353116908982,
"acc_norm_stderr": 0.0035757440987799504
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7925925925925926,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.7925925925925926,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8113207547169812,
"acc_stderr": 0.024079995130062253,
"acc_norm": 0.8113207547169812,
"acc_norm_stderr": 0.024079995130062253
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.02477451625044017,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.02477451625044017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.026556982117838735,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.026556982117838735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.746031746031746,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.746031746031746,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.044154382267437446,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.044154382267437446
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8935483870967742,
"acc_stderr": 0.01754510295165663,
"acc_norm": 0.8935483870967742,
"acc_norm_stderr": 0.01754510295165663
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656187,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656187
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295136,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295136
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8384615384615385,
"acc_stderr": 0.018659703705332972,
"acc_norm": 0.8384615384615385,
"acc_norm_stderr": 0.018659703705332972
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.48518518518518516,
"acc_stderr": 0.030472153249328598,
"acc_norm": 0.48518518518518516,
"acc_norm_stderr": 0.030472153249328598
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8781512605042017,
"acc_stderr": 0.021248144538412016,
"acc_norm": 0.8781512605042017,
"acc_norm_stderr": 0.021248144538412016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5298013245033113,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.5298013245033113,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.010919426411848631,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.010919426411848631
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03179876342176851,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03179876342176851
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.01849831520686538,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.01849831520686538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.916030534351145,
"acc_stderr": 0.024324504024906605,
"acc_norm": 0.916030534351145,
"acc_norm_stderr": 0.024324504024906605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9700854700854701,
"acc_stderr": 0.011160101145288034,
"acc_norm": 0.9700854700854701,
"acc_norm_stderr": 0.011160101145288034
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9220945083014048,
"acc_stderr": 0.009584476076693058,
"acc_norm": 0.9220945083014048,
"acc_norm_stderr": 0.009584476076693058
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7754189944134078,
"acc_stderr": 0.01395680366654464,
"acc_norm": 0.7754189944134078,
"acc_norm_stderr": 0.01395680366654464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8660130718954249,
"acc_stderr": 0.019504890618464815,
"acc_norm": 0.8660130718954249,
"acc_norm_stderr": 0.019504890618464815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8360128617363344,
"acc_stderr": 0.021029576464662695,
"acc_norm": 0.8360128617363344,
"acc_norm_stderr": 0.021029576464662695
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505388,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505388
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.028121636040639896,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.028121636040639896
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6375488917861799,
"acc_stderr": 0.012277512533252499,
"acc_norm": 0.6375488917861799,
"acc_norm_stderr": 0.012277512533252499
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.875,
"acc_stderr": 0.020089743302935947,
"acc_norm": 0.875,
"acc_norm_stderr": 0.020089743302935947
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108568,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108568
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736844,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.01969463855669321,
"acc_norm": 0.96,
"acc_norm_stderr": 0.01969463855669321
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9239766081871345,
"acc_stderr": 0.020327297744388382,
"acc_norm": 0.9239766081871345,
"acc_norm_stderr": 0.020327297744388382
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.01718561172775337,
"mc2": 0.5842179226306571,
"mc2_stderr": 0.015288978796043103
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569572
},
"harness|gsm8k|5": {
"acc": 0.6338134950720242,
"acc_stderr": 0.013270100238748826
}
}
```
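Since the JSON above maps benchmark names to metric dicts, ranking tasks by accuracy takes only a few lines; a minimal sketch over the structure shown (the two sample entries are copied verbatim from the results above):

```python
import json

# Two entries copied from the results JSON above; in practice, load the
# full results_2024-01-23T16-04-24.583795.json file from this repository.
raw = """{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.54, "acc_stderr": 0.05009082659620332},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.7925925925925926, "acc_stderr": 0.03502553170678318}
}"""

metrics = json.loads(raw)
# Sort benchmarks by accuracy, highest first.
for task, scores in sorted(metrics.items(), key=lambda kv: kv[1]["acc"], reverse=True):
    print(f"{task}: acc={scores['acc']:.3f} +/- {scores['acc_stderr']:.3f}")
```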
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051612__A0123 | [
"region:us"
] | 2024-01-23T16:06:37+00:00 | {"pretty_name": "Evaluation run of AA051612/A0123", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051612/A0123](https://huggingface.co/AA051612/A0123) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051612__A0123\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T16:04:24.583795](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051612__A0123/blob/main/results_2024-01-23T16-04-24.583795.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7779570468818447,\n \"acc_stderr\": 0.027463665563981252,\n \"acc_norm\": 0.7837159328019954,\n \"acc_norm_stderr\": 0.027955874516834893,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5842179226306571,\n \"mc2_stderr\": 0.015288978796043103\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600938,\n \"acc_norm\": 0.6766211604095563,\n \"acc_norm_stderr\": 0.013669421630012125\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6557458673571002,\n \"acc_stderr\": 0.004741534106470289,\n \"acc_norm\": 0.8487353116908982,\n \"acc_norm_stderr\": 0.0035757440987799504\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7925925925925926,\n \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.7925925925925926,\n \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8113207547169812,\n \"acc_stderr\": 0.024079995130062253,\n \"acc_norm\": 0.8113207547169812,\n \"acc_norm_stderr\": 0.024079995130062253\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n \"acc_stderr\": 0.02477451625044017,\n \"acc_norm\": 0.9027777777777778,\n \"acc_norm_stderr\": 0.02477451625044017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n 
\"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838735,\n \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.746031746031746,\n \"acc_stderr\": 0.022418042891113942,\n \"acc_norm\": 0.746031746031746,\n \"acc_norm_stderr\": 0.022418042891113942\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.044154382267437446,\n \"acc_norm\": 0.5793650793650794,\n \"acc_norm_stderr\": 0.044154382267437446\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.01754510295165663,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.01754510295165663\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656187,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656187\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295136,\n \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295136\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8384615384615385,\n 
\"acc_stderr\": 0.018659703705332972,\n \"acc_norm\": 0.8384615384615385,\n \"acc_norm_stderr\": 0.018659703705332972\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.48518518518518516,\n \"acc_stderr\": 0.030472153249328598,\n \"acc_norm\": 0.48518518518518516,\n \"acc_norm_stderr\": 0.030472153249328598\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8781512605042017,\n \"acc_stderr\": 0.021248144538412016,\n \"acc_norm\": 0.8781512605042017,\n \"acc_norm_stderr\": 0.021248144538412016\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5298013245033113,\n \"acc_stderr\": 0.04075224992216979,\n \"acc_norm\": 0.5298013245033113,\n \"acc_norm_stderr\": 0.04075224992216979\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9302752293577982,\n \"acc_stderr\": 0.010919426411848631,\n \"acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.010919426411848631\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176851,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176851\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.01849831520686538,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.01849831520686538\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.916030534351145,\n \"acc_stderr\": 0.024324504024906605,\n \"acc_norm\": 0.916030534351145,\n \"acc_norm_stderr\": 0.024324504024906605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9700854700854701,\n \"acc_stderr\": 0.011160101145288034,\n \"acc_norm\": 0.9700854700854701,\n \"acc_norm_stderr\": 0.011160101145288034\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9220945083014048,\n \"acc_stderr\": 0.009584476076693058,\n \"acc_norm\": 
0.9220945083014048,\n \"acc_norm_stderr\": 0.009584476076693058\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7754189944134078,\n \"acc_stderr\": 0.01395680366654464,\n \"acc_norm\": 0.7754189944134078,\n \"acc_norm_stderr\": 0.01395680366654464\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8660130718954249,\n \"acc_stderr\": 0.019504890618464815,\n \"acc_norm\": 0.8660130718954249,\n \"acc_norm_stderr\": 0.019504890618464815\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8360128617363344,\n \"acc_stderr\": 0.021029576464662695,\n \"acc_norm\": 0.8360128617363344,\n \"acc_norm_stderr\": 0.021029576464662695\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505388,\n \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505388\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.028121636040639896,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.028121636040639896\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6375488917861799,\n \"acc_stderr\": 0.012277512533252499,\n \"acc_norm\": 0.6375488917861799,\n \"acc_norm_stderr\": 0.012277512533252499\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.020089743302935947,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.020089743302935947\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108568,\n \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108568\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736844,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9239766081871345,\n \"acc_stderr\": 0.020327297744388382,\n \"acc_norm\": 0.9239766081871345,\n \"acc_norm_stderr\": 0.020327297744388382\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.01718561172775337,\n \"mc2\": 0.5842179226306571,\n \"mc2_stderr\": 0.015288978796043103\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569572\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6338134950720242,\n \"acc_stderr\": 0.013270100238748826\n }\n}\n```", "repo_url": "https://huggingface.co/AA051612/A0123", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|arc:challenge|25_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|gsm8k|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hellaswag|10_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-24.583795.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-24.583795.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-24.583795.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-24.583795.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-24.583795.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-24.583795.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["**/details_harness|winogrande|5_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T16-04-24.583795.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T16_04_24.583795", "path": ["results_2024-01-23T16-04-24.583795.parquet"]}, {"split": "latest", "path": 
["results_2024-01-23T16-04-24.583795.parquet"]}]}]} | 2024-01-23T16:06:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051612/A0123
Dataset automatically created during the evaluation run of model AA051612/A0123 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
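A minimal sketch, assuming the dataset follows the leaderboard's usual naming convention of `open-llm-leaderboard/details_<org>__<model>` (the exact repository id is an assumption inferred from the model name):

```python
from datasets import load_dataset

# Assumed dataset id, following the leaderboard convention
# "open-llm-leaderboard/details_<org>__<model>".
data = load_dataset("open-llm-leaderboard/details_AA051612__A0123",
	"harness_winogrande_5",  # one of the 63 task configurations
	split="train")           # "train" always points to the latest results
```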
## Latest results
These are the latest results from run 2024-01-23T16:04:24.583795 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051612/A0123\n\n\n\nDataset automatically created during the evaluation run of model AA051612/A0123 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T16:04:24.583795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051612/A0123\n\n\n\nDataset automatically created during the evaluation run of model AA051612/A0123 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T16:04:24.583795(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c0fa36372ab0ac36fc932d6ca6c6c46936025877 |
# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [moreh/MoMo-72B-LoRA-V1.4](https://huggingface.co/moreh/MoMo-72B-LoRA-V1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
"harness_winogrande_5",
split="train")
```
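The per-task configurations cover the individual evals; the aggregated run metrics live in the "results" configuration. A short sketch (the "latest" split always resolves to the most recent evaluation, per the config list in this card's metadata):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split resolves to the most recent evaluation timestamp.
results = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
	"results",
	split="latest")
print(results)
```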
## Latest results
These are the [latest results from run 2024-01-23T16:04:31.974189](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4/blob/main/results_2024-01-23T16-04-31.974189.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7688422687772649,
"acc_stderr": 0.028070972631424856,
"acc_norm": 0.772553429938345,
"acc_norm_stderr": 0.028607738168463646,
"mc1": 0.4565483476132191,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6271456347906439,
"mc2_stderr": 0.014869181356225341
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902276,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344
},
"harness|hellaswag|10": {
"acc": 0.6597291376219877,
"acc_stderr": 0.0047283185778352055,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.003563124427458504
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.022691482872035342,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.022691482872035342
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9236111111111112,
"acc_stderr": 0.02221220393834591,
"acc_norm": 0.9236111111111112,
"acc_norm_stderr": 0.02221220393834591
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.04913595201274502,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.04913595201274502
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8068965517241379,
"acc_stderr": 0.03289445522127398,
"acc_norm": 0.8068965517241379,
"acc_norm_stderr": 0.03289445522127398
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6746031746031746,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.6746031746031746,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.01800360332586361,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.01800360332586361
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280459,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280459
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019951,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019951
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909046,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909046
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.01967163241310029,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.01967163241310029
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.03038416923235082,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.03038416923235082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03114144782353605,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03114144782353605
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.018889750550956715,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.018889750550956715
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9195402298850575,
"acc_stderr": 0.009726831316141866,
"acc_norm": 0.9195402298850575,
"acc_norm_stderr": 0.009726831316141866
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.693854748603352,
"acc_stderr": 0.01541449448790321,
"acc_norm": 0.693854748603352,
"acc_norm_stderr": 0.01541449448790321
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.020645597910418777,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.020645597910418777
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.021514051585970397,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.021514051585970397
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.01810541409432967,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.01810541409432967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6453900709219859,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.6453900709219859,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6095176010430248,
"acc_stderr": 0.012460135913945066,
"acc_norm": 0.6095176010430248,
"acc_norm_stderr": 0.012460135913945066
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.022571771025494715,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.022571771025494715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8120915032679739,
"acc_stderr": 0.01580356573677668,
"acc_norm": 0.8120915032679739,
"acc_norm_stderr": 0.01580356573677668
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650153,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136616,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136616
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4565483476132191,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6271456347906439,
"mc2_stderr": 0.014869181356225341
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343345
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.012625423152283042
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4 | [
"region:us"
] | 2024-01-23T16:06:38+00:00 | {"pretty_name": "Evaluation run of moreh/MoMo-72B-LoRA-V1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [moreh/MoMo-72B-LoRA-V1.4](https://huggingface.co/moreh/MoMo-72B-LoRA-V1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T16:04:31.974189](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4/blob/main/results_2024-01-23T16-04-31.974189.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7688422687772649,\n \"acc_stderr\": 0.028070972631424856,\n \"acc_norm\": 0.772553429938345,\n \"acc_norm_stderr\": 0.028607738168463646,\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6271456347906439,\n \"mc2_stderr\": 0.014869181356225341\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902276,\n \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6597291376219877,\n \"acc_stderr\": 0.0047283185778352055,\n \"acc_norm\": 0.8500298745269866,\n \"acc_norm_stderr\": 0.003563124427458504\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.022691482872035342,\n \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.022691482872035342\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9236111111111112,\n \"acc_stderr\": 0.02221220393834591,\n \"acc_norm\": 0.9236111111111112,\n \"acc_norm_stderr\": 0.02221220393834591\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.04913595201274502,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.04913595201274502\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8068965517241379,\n \"acc_stderr\": 0.03289445522127398,\n \"acc_norm\": 0.8068965517241379,\n \"acc_norm_stderr\": 0.03289445522127398\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6746031746031746,\n \"acc_stderr\": 0.024130158299762613,\n \"acc_norm\": 0.6746031746031746,\n \"acc_norm_stderr\": 0.024130158299762613\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n \"acc_stderr\": 0.01800360332586361,\n \"acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.01800360332586361\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280459,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280459\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019951,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019951\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909046,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909046\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.01967163241310029,\n \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.01967163241310029\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.03038416923235082,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.03038416923235082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03114144782353605,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03114144782353605\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.018889750550956715,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.018889750550956715\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9195402298850575,\n \"acc_stderr\": 0.009726831316141866,\n 
\"acc_norm\": 0.9195402298850575,\n \"acc_norm_stderr\": 0.009726831316141866\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.693854748603352,\n \"acc_stderr\": 0.01541449448790321,\n \"acc_norm\": 0.693854748603352,\n \"acc_norm_stderr\": 0.01541449448790321\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.020645597910418777,\n \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.020645597910418777\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n \"acc_stderr\": 0.021514051585970397,\n \"acc_norm\": 0.8263665594855305,\n \"acc_norm_stderr\": 0.021514051585970397\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.01810541409432967,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.01810541409432967\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6453900709219859,\n \"acc_stderr\": 0.02853865002887864,\n \"acc_norm\": 0.6453900709219859,\n \"acc_norm_stderr\": 0.02853865002887864\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6095176010430248,\n \"acc_stderr\": 0.012460135913945066,\n \"acc_norm\": 0.6095176010430248,\n \"acc_norm_stderr\": 0.012460135913945066\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.022571771025494715,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.022571771025494715\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8120915032679739,\n \"acc_stderr\": 0.01580356573677668,\n \"acc_norm\": 0.8120915032679739,\n \"acc_norm_stderr\": 0.01580356573677668\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650153,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4565483476132191,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6271456347906439,\n \"mc2_stderr\": 0.014869181356225341\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343345\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \"acc_stderr\": 0.012625423152283042\n }\n}\n```", "repo_url": 
"https://huggingface.co/moreh/MoMo-72B-LoRA-V1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|arc:challenge|25_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|gsm8k|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hellaswag|10_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-31.974189.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-31.974189.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-31.974189.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-31.974189.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-31.974189.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T16_04_31.974189", "path": ["**/details_harness|winogrande|5_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T16-04-31.974189.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T16_04_31.974189", "path": ["results_2024-01-23T16-04-31.974189.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T16-04-31.974189.parquet"]}]}]} | 2024-01-23T16:07:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4
Dataset automatically created during the evaluation run of model moreh/MoMo-72B-LoRA-V1.4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
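A minimal sketch using the `datasets` library (the repository name, the `harness_winogrande_5` configuration, and the `results` configuration with its "latest" split are all taken from this card's metadata):

```python
from datasets import load_dataset

# Per-task details: "harness_winogrande_5" is one of the 63 available configurations
data = load_dataset(
    "open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
    "harness_winogrande_5",
    split="train",
)

# Aggregated metrics live in the "results" configuration;
# its "latest" split points at the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
    "results",
    split="latest",
)
```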
## Latest results
These are the latest results from run 2024-01-23T16:04:31.974189 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4\n\n\n\nDataset automatically created during the evaluation run of model moreh/MoMo-72B-LoRA-V1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T16:04:31.974189(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4\n\n\n\nDataset automatically created during the evaluation run of model moreh/MoMo-72B-LoRA-V1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T16:04:31.974189(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6dd0781a7def399fecc66564a939af7096e45502 |
# Summary
`aaditya/databricks-dolly-15k-Hindi` is an open-source Hindi version of the databricks/databricks-dolly-15k dataset.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Hindi
Version: 1.0
Original Dataset repo:
https://huggingface.co/datasets/databricks/databricks-dolly-15k
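A minimal loading sketch (assuming the standard `datasets` API; the split name, row count, and column names below come from this repo's dataset info):

```python
from datasets import load_dataset

# Single "train" split with 15,010 parallel English/Hindi rows
ds = load_dataset("aaditya/databricks-dolly-15k-Hindi", split="train")

row = ds[0]
# Original English fields
print(row["en_instruction"], row["en_output"])
# Hindi translations
print(row["hindi_instruction"], row["hindi_output"])
```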
# Citation
```
@misc {dolly_hindi,
author = { Pal, Ankit },
title = { databricks-dolly-15k-Hindi},
year = 2024,
url = { https://huggingface.co/datasets/aaditya/databricks-dolly-15k-Hindi },
doi = { 10.57967/hf/1676 },
publisher = { Hugging Face }
}
``` | aaditya/databricks-dolly-15k-Hindi | [
"hindi",
"doi:10.57967/hf/1676",
"region:us"
] | 2024-01-23T16:48:15+00:00 | {"dataset_info": {"features": [{"name": "en_instruction", "dtype": "string"}, {"name": "en_input", "dtype": "string"}, {"name": "en_output", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "en_category", "dtype": "string"}, {"name": "hindi_instruction", "dtype": "string"}, {"name": "hindi_input", "dtype": "string"}, {"name": "hindi_output", "dtype": "string"}, {"name": "hindi_category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 38525353, "num_examples": 15010}], "download_size": 18858317, "dataset_size": 38525353}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["hindi"]} | 2024-01-25T00:20:16+00:00 | [] | [] | TAGS
#hindi #doi-10.57967/hf/1676 #region-us
|
# Summary
'aaditya/databricks-dolly-15k-Hindi' is an open-source Hindi version of the databricks/databricks-dolly-15k dataset.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the
Creative Commons Attribution-ShareAlike 3.0 Unported License.
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Hindi
Version: 1.0
Original Dataset repo
URL
| [
"# Summary\n'aaditya/databricks-dolly-15k-Hindi' is an open source Hindi version dataset of databricks/databricks-dolly-15k.\n\nThis dataset can be used for any purpose, whether academic or commercial, under the terms of the \nCreative Commons Attribution-ShareAlike 3.0 Unported License.\n\nSupported Tasks: \n- Training LLMs\n- Synthetic Data Generation\n- Data Augmentation\n \nLanguages: Hindi\n\nVersion: 1.0\n\nOriginal Dataset repo\nURL"
] | [
"TAGS\n#hindi #doi-10.57967/hf/1676 #region-us \n",
"# Summary\n'aaditya/databricks-dolly-15k-Hindi' is an open source Hindi version dataset of databricks/databricks-dolly-15k.\n\nThis dataset can be used for any purpose, whether academic or commercial, under the terms of the \nCreative Commons Attribution-ShareAlike 3.0 Unported License.\n\nSupported Tasks: \n- Training LLMs\n- Synthetic Data Generation\n- Data Augmentation\n \nLanguages: Hindi\n\nVersion: 1.0\n\nOriginal Dataset repo\nURL"
] |
161d6cde0ed4344a43ec8e5e76550c00f176b100 |
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3",
"harness_winogrande_5",
split="train")
```
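The aggregated metrics mentioned above can be pulled from the "results" configuration as well; a sketch assuming the same split layout as the other leaderboard detail datasets (with the "latest" split pointing at the most recent of the 2 runs):

```python
from datasets import load_dataset

# Aggregated results; "latest" resolves to the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3",
    "results",
    split="latest",
)
```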
## Latest results
These are the [latest results from run 2024-01-23T17:21:37.026762](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3/blob/main/results_2024-01-23T17-21-37.026762.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6492734784879889,
"acc_stderr": 0.032252642569362194,
"acc_norm": 0.6486765416619994,
"acc_norm_stderr": 0.03292611927722137,
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7276214821635104,
"mc2_stderr": 0.01481934718420006
},
"harness|arc:challenge|25": {
"acc": 0.7175767918088737,
"acc_stderr": 0.013155456884097222,
"acc_norm": 0.7397610921501706,
"acc_norm_stderr": 0.012821930225112573
},
"harness|hellaswag|10": {
"acc": 0.7389962158932484,
"acc_stderr": 0.0043828441286434296,
"acc_norm": 0.8925512846046604,
"acc_norm_stderr": 0.003090499801090434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469543,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469543
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993462,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993462
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050876,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050876
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7276214821635104,
"mc2_stderr": 0.01481934718420006
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.01012062325227297
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222095
}
}
```
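For quick inspection outside the `datasets` library, the aggregated JSON above can also be fetched directly from the repo. The following is a minimal sketch, not part of the standard card: the `repo_id` and filename are taken verbatim from the link above, `huggingface_hub` is assumed to be installed, and the file is assumed to keep the layout shown above (possibly nested under a top-level `"results"` key).

```python
import json

from huggingface_hub import hf_hub_download

# Repo id and filename copied from the "latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3",
    filename="results_2024-01-23T17-21-37.026762.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The aggregated metrics may sit under a top-level "results" key (an
# assumption); fall back to the flat layout shown in this card otherwise.
results = data.get("results", data)

# List each harness task by accuracy, best first ("all" is the average;
# entries without an "acc" field, such as truthfulqa's mc1/mc2, are skipped).
for task, metrics in sorted(
    results.items(), key=lambda kv: kv[1].get("acc", 0.0), reverse=True
):
    if "acc" in metrics:
        print(f"{task}: acc={metrics['acc']:.4f} "
              f"(stderr {metrics.get('acc_stderr', float('nan')):.4f})")
```

Sorting by `acc` makes it easy to spot the strongest and weakest MMLU subtasks at a glance.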
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["**/details_harness|winogrande|5_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": ["**/details_harness|winogrande|5_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T17-21-37.026762.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T16_59_15.896030", "path": ["results_2024-01-23T16-59-15.896030.parquet"]}, {"split": "2024_01_23T17_21_37.026762", "path": 
["results_2024-01-23T17-21-37.026762.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T17-21-37.026762.parquet"]}]}]} | 2024-01-23T17:23:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3
Dataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
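A minimal sketch of that call, assuming the repo id follows the `open-llm-leaderboard/details_<org>__<model>` naming used by these evaluation datasets, with one of the config names listed in this record's metadata:

```python
from datasets import load_dataset

# Sketch: the repo id is assumed from the leaderboard's naming convention;
# "harness_winogrande_5" is one of the configs listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3",
    "harness_winogrande_5",
    split="train",
)
```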
## Latest results
These are the latest results from run 2024-01-23T17:21:37.026762 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:21:37.026762(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:21:37.026762(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
df2949091cde64ec75f88c3f8738e05895619256 | # Dataset Card for "CSUN_bedroom_VQA_feliu_v2"
 | fformosa/LSUN_bedroom_VQA_v2 | [
"task_categories:question-answering",
"task_categories:zero-shot-classification",
"task_categories:feature-extraction",
"task_categories:visual-question-answering",
"task_categories:image-classification",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | 2024-01-23T17:04:19+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "zero-shot-classification", "feature-extraction", "visual-question-answering", "image-classification"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "size", "sequence": "int64"}, {"name": "proportion", "dtype": "float64"}, {"name": "new_image_id", "dtype": "int64"}, {"name": "new_attributes", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 836782860, "num_examples": 58266}], "download_size": 815988446, "dataset_size": 836782860}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-30T12:36:36+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-zero-shot-classification #task_categories-feature-extraction #task_categories-visual-question-answering #task_categories-image-classification #size_categories-10K<n<100K #language-English #license-mit #region-us
| # Dataset Card for "CSUN_bedroom_VQA_feliu_v2"
!image/png | [
"# Dataset Card for \"CSUN_bedroom_VQA_feliu_v2\"\n\n\n!image/png"
] | [
"TAGS\n#task_categories-question-answering #task_categories-zero-shot-classification #task_categories-feature-extraction #task_categories-visual-question-answering #task_categories-image-classification #size_categories-10K<n<100K #language-English #license-mit #region-us \n",
"# Dataset Card for \"CSUN_bedroom_VQA_feliu_v2\"\n\n\n!image/png"
] |
29accb3babac823b8f63325e67562669913da9b0 |
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2",
"harness_winogrande_5",
split="train")
```
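Beyond the per-task details, the aggregated metrics can be pulled from the "results" configuration mentioned above. A minimal sketch, assuming the same config/split naming convention as the per-task files:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# tracks the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2",
    "results",
    split="latest",
)
print(results[0])  # the aggregate record for the most recent evaluation run
```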
## Latest results
These are the [latest results from run 2024-01-23T17:04:25.517599](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2/blob/main/results_2024-01-23T17-04-25.517599.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6490724115520589,
"acc_stderr": 0.032259594360133925,
"acc_norm": 0.648421745854093,
"acc_norm_stderr": 0.03293377935819422,
"mc1": 0.602203182374541,
"mc1_stderr": 0.01713393424855968,
"mc2": 0.7273114161495677,
"mc2_stderr": 0.014814015917833025
},
"harness|arc:challenge|25": {
"acc": 0.7192832764505119,
"acc_stderr": 0.01313123812697558,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927106
},
"harness|hellaswag|10": {
"acc": 0.7392949611631149,
"acc_stderr": 0.0043812204096411725,
"acc_norm": 0.8924517028480382,
"acc_norm_stderr": 0.003091759094519539
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993459,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530626,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530626
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623553,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.602203182374541,
"mc1_stderr": 0.01713393424855968,
"mc2": 0.7273114161495677,
"mc2_stderr": 0.014814015917833025
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272982
},
"harness|gsm8k|5": {
"acc": 0.6641394996209249,
"acc_stderr": 0.013009224714267362
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2 | [
"region:us"
] | 2024-01-23T17:06:43+00:00 | {"pretty_name": "Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2", "dataset_summary": "Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T17:04:25.517599](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2/blob/main/results_2024-01-23T17-04-25.517599.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6490724115520589,\n \"acc_stderr\": 0.032259594360133925,\n \"acc_norm\": 0.648421745854093,\n \"acc_norm_stderr\": 0.03293377935819422,\n \"mc1\": 0.602203182374541,\n \"mc1_stderr\": 0.01713393424855968,\n \"mc2\": 0.7273114161495677,\n \"mc2_stderr\": 0.014814015917833025\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7192832764505119,\n \"acc_stderr\": 0.01313123812697558,\n \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927106\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7392949611631149,\n \"acc_stderr\": 0.0043812204096411725,\n \"acc_norm\": 0.8924517028480382,\n \"acc_norm_stderr\": 0.003091759094519539\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n 
\"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 
0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n 
\"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n \"acc_norm_stderr\": 0.016588680864530626\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.602203182374541,\n \"mc1_stderr\": 0.01713393424855968,\n \"mc2\": 0.7273114161495677,\n \"mc2_stderr\": 0.014814015917833025\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272982\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.6641394996209249,\n \"acc_stderr\": 0.013009224714267362\n }\n}\n```", "repo_url": "https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["**/details_harness|winogrande|5_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-23T17-04-25.517599.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T17_04_25.517599", "path": ["results_2024-01-23T17-04-25.517599.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T17-04-25.517599.parquet"]}]}]} | 2024-01-23T17:07:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2
Dataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
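A minimal sketch (this stripped rendering omits the original snippet; the repo name below is assumed from the leaderboard's standard `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Repo name assumed from the leaderboard's details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2",
    "harness_winogrande_5",
    split="train",
)
```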
## Latest results
These are the latest results from run 2024-01-23T17:04:25.517599 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:04:25.517599(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:04:25.517599(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ddfb02ae9b403df4d6fb5ce77aced99caf6c59e5 |
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2",
"harness_winogrande_5",
split="train")
```
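The same call works for any of the 63 configurations. As a hedged variant (config and split names are taken from this card's metadata; the timestamped split name is inferred from the run timestamp below and pins an exact run instead of the rolling "latest"):

```python
from datasets import load_dataset

# Pin the exact run instead of the rolling "train"/"latest" split; the split
# name is derived from the run timestamp 2024-01-23T17:10:17.238798 (assumed
# to follow the same pattern as the other timestamped splits in this repo).
mmlu_law = load_dataset(
    "open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2",
    "harness_hendrycksTest_professional_law_5",
    split="2024_01_23T17_10_17.238798",
)
```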
## Latest results
These are the [latest results from run 2024-01-23T17:10:17.238798](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2/blob/main/results_2024-01-23T17-10-17.238798.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.649036082810174,
"acc_stderr": 0.03219249418280251,
"acc_norm": 0.6482917525387473,
"acc_norm_stderr": 0.03286355631611186,
"mc1": 0.5997552019583844,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.7234408992204887,
"mc2_stderr": 0.014869349089766148
},
"harness|arc:challenge|25": {
"acc": 0.7218430034129693,
"acc_stderr": 0.013094469919538804,
"acc_norm": 0.7440273037542662,
"acc_norm_stderr": 0.01275301324124453
},
"harness|hellaswag|10": {
"acc": 0.7394941246763593,
"acc_stderr": 0.00438013646854394,
"acc_norm": 0.8929496116311492,
"acc_norm_stderr": 0.003085454286883946
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653342,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653342
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5997552019583844,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.7234408992204887,
"mc2_stderr": 0.014869349089766148
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028217
},
"harness|gsm8k|5": {
"acc": 0.6762699014404853,
"acc_stderr": 0.012888247397371143
}
}
```
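To retrieve these aggregates programmatically rather than copying the JSON, a small sketch (assuming the "results" configuration layout described above, with one row per run):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown in the JSON above.
results = load_dataset(
    "open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2",
    "results",
    split="latest",
)
print(results[0])  # e.g. the "all" averages, TruthfulQA mc1/mc2, GSM8K acc
```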
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2 | [
"region:us"
] | 2024-01-23T17:12:36+00:00 | {"pretty_name": "Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2", "dataset_summary": "Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T17:10:17.238798](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2/blob/main/results_2024-01-23T17-10-17.238798.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.649036082810174,\n \"acc_stderr\": 0.03219249418280251,\n \"acc_norm\": 0.6482917525387473,\n \"acc_norm_stderr\": 0.03286355631611186,\n \"mc1\": 0.5997552019583844,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.7234408992204887,\n \"mc2_stderr\": 0.014869349089766148\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7218430034129693,\n \"acc_stderr\": 0.013094469919538804,\n \"acc_norm\": 0.7440273037542662,\n \"acc_norm_stderr\": 0.01275301324124453\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7394941246763593,\n \"acc_stderr\": 0.00438013646854394,\n \"acc_norm\": 0.8929496116311492,\n \"acc_norm_stderr\": 0.003085454286883946\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 
0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n 
\"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653342,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653342\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5997552019583844,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.7234408992204887,\n \"mc2_stderr\": 0.014869349089766148\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028217\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6762699014404853,\n \"acc_stderr\": 0.012888247397371143\n }\n}\n```", "repo_url": "https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-10-17.238798.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-10-17.238798.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-10-17.238798.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-10-17.238798.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-10-17.238798.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["**/details_harness|winogrande|5_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-23T17-10-17.238798.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T17_10_17.238798", "path": ["results_2024-01-23T17-10-17.238798.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T17-10-17.238798.parquet"]}]}]} | 2024-01-23T17:13:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2
Dataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
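```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed2",
	"harness_winogrande_5",
	split="train")
```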
## Latest results
These are the latest results from run 2024-01-23T17:10:17.238798 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:10:17.238798(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:10:17.238798(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
cb8811bb7a50cfa5ac6f7865fa29f0ab392f996c |
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-ed2",
"harness_winogrande_5",
split="train")
```
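The aggregated metrics live in the "results" configuration mentioned above. Below is a minimal sketch for loading them, assuming this dataset follows the same split naming as the other leaderboard detail datasets (where the "latest" split points to the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-ed2",
	"results",
	split="latest")
```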
## Latest results
These are the [latest results from run 2024-01-23T17:11:58.023501](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-ed2/blob/main/results_2024-01-23T17-11-58.023501.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6486907308358316,
"acc_stderr": 0.032182772851276305,
"acc_norm": 0.648316043630535,
"acc_norm_stderr": 0.03285322786437363,
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6448562422390527,
"mc2_stderr": 0.01524548163396442
},
"harness|arc:challenge|25": {
"acc": 0.6800341296928327,
"acc_stderr": 0.013631345807016195,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.013203196088537372
},
"harness|hellaswag|10": {
"acc": 0.6872137024497113,
"acc_stderr": 0.004626805906522212,
"acc_norm": 0.874228241386178,
"acc_norm_stderr": 0.003309142727351092
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903333,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816657,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816657
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653349,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623553,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482708,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482708
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6448562422390527,
"mc2_stderr": 0.01524548163396442
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962524
},
"harness|gsm8k|5": {
"acc": 0.7119029567854435,
"acc_stderr": 0.012474469737197923
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-ed2 | [
"region:us"
] | 2024-01-23T17:14:15+00:00 | {"pretty_name": "Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2", "dataset_summary": "Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-ed2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T17:11:58.023501](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-ed2/blob/main/results_2024-01-23T17-11-58.023501.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6486907308358316,\n \"acc_stderr\": 0.032182772851276305,\n \"acc_norm\": 0.648316043630535,\n \"acc_norm_stderr\": 0.03285322786437363,\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6448562422390527,\n \"mc2_stderr\": 0.01524548163396442\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.013631345807016195,\n \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.013203196088537372\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6872137024497113,\n \"acc_stderr\": 0.004626805906522212,\n \"acc_norm\": 0.874228241386178,\n \"acc_norm_stderr\": 0.003309142727351092\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 
0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 
0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903333,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903333\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482708,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482708\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48959608323133413,\n \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6448562422390527,\n \"mc2_stderr\": 0.01524548163396442\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962524\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.7119029567854435,\n \"acc_stderr\": 0.012474469737197923\n }\n}\n```", "repo_url": "https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-11-58.023501.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-11-58.023501.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-11-58.023501.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-11-58.023501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-11-58.023501.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["**/details_harness|winogrande|5_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-23T17-11-58.023501.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T17_11_58.023501", "path": ["results_2024-01-23T17-11-58.023501.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T17-11-58.023501.parquet"]}]}]} | 2024-01-23T17:14:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2
Dataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, one corresponding to each of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
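```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-ed2",
	"harness_winogrande_5",
	split="train")
```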
## Latest results
These are the latest results from run 2024-01-23T17:11:58.023501 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-ed2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:11:58.023501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
# Dataset Card for Helper-Jhonny Code Mentor
## Dataset Details
### Dataset Description
Helper-Jhonny is a curated dataset designed to support mentoring in code development across three programming languages: Python, JavaScript, and SQL. The dataset is focused on question-answering scenarios related to coding tasks. It aims to assist learners and developers in improving their skills and understanding of these programming languages.
- **Curated by:** Helper-Jhonny Team
- **Funded by [optional]:** Lara Ayne
- **Language(s) (NLP):** Portuguese (pt), English (en), Spanish (es)
- **License:** llama2
### Dataset Sources [optional]
- **Repository:** [Link to Repository]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
### Direct Use
The dataset is suitable for direct use in scenarios where learners and developers seek assistance and guidance in coding tasks. It can be utilized for building applications or platforms that provide real-time code mentoring and support.
### Out-of-Scope Use
The dataset is not intended for misuse or malicious use, and it may not work well for question-answering tasks that are unrelated to code.
## Dataset Structure
The dataset contains information relevant to code mentoring, including questions and corresponding answers for Python, JavaScript, and SQL. Each entry is tagged with the programming language to facilitate language-specific mentoring.
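
For illustration, here is a minimal loading sketch. The repo id `larinhaIA/help-code` comes from this card's metadata, but the column names (`question`, `answer`, `language`) are assumptions, since the card does not publish an exact schema:

```python
from datasets import load_dataset

# NOTE: column names ("question", "answer", "language") are assumed for
# illustration; the card does not document an exact schema.
ds = load_dataset("larinhaIA/help-code", split="train")

# Keep only the SQL entries via the hypothetical "language" tag column.
sql_rows = ds.filter(lambda row: row["language"] == "sql")
print(sql_rows[0]["question"], "->", sql_rows[0]["answer"])
```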
## Dataset Creation
### Curation Rationale
The dataset was created to address the need for a comprehensive code mentoring resource, focusing on three widely used programming languages. The goal is to provide learners with practical guidance and support in their coding journey.
### Source Data
#### Data Collection and Processing
The dataset comprises questions and answers gathered from various code mentoring sessions. The data selection criteria include relevance to common coding challenges and tasks faced by learners and developers. The data collection process involves curating real-world coding queries and their solutions.
#### Who are the source data producers?
The source data producers are experienced mentors and developers who actively contribute to the code mentoring community.
### Annotations [optional]
#### Annotation process
Annotations include tagging each entry with the respective programming language to ensure language-specific mentoring. Annotators are experienced mentors with expertise in Python, JavaScript, and SQL.
#### Who are the annotators?
Annotations are performed by a team of skilled code mentors with proficiency in Python, JavaScript, and SQL.
#### Personal and Sensitive Information
The dataset does not contain personal, sensitive, or private information.
## Bias, Risks, and Limitations
The dataset may exhibit biases based on the expertise and perspectives of the annotators. Users should be aware that mentoring scenarios may not cover every edge case, and the dataset is not exhaustive.
### Recommendations
Users should be made aware of the dataset's limitations and encouraged to supplement their learning with diverse resources.
# Multimodal Datasets for Training Python Copilots from Source Code Analysis
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/static/matlok-multimodal-python-copilot-training-datasets-intro-1.jpg" alt="Multimodal Datasets for Training Python Copilots from Source Code Analysis" width="500" style="display: block; margin: auto;"/>
Welcome to the matlok multimodal python copilot training datasets. This is an overview of our training and fine-tuning datasets:
- ~2.3M unique source coding rows
- 1.1M+ instruct alpaca yaml text rows updated bi-weekly
- ~923K png knowledge graph images with alpaca text description
- ~334K mp3s over ~2 years of continuous audio playtime
- requires 1.5 TB storage on disk
Please reach out if you find an issue or want help with a similar dataset. We want to make it easier to create and share large datasets:
[email protected]
## Source Code Datasets
The source code datasets use Python AST parsing to extract all classes, functions, base classes, imports, and source code details from 1258 github repos spanning: ai, ml, compute, infrastructure, and architecture. A toy sketch of this kind of traversal follows the stats below.
- Python source code size on disk (all repos): **146.8 GB**
- Number of python files: 283637
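
To give a feel for that extraction (a toy sketch only, not the project's actual pipeline), Python's built-in `ast` module can pull the same kinds of names from a single file:

```python
import ast

def summarize_module(path: str) -> dict:
    """Collect class, base-class, function, and import names from one file."""
    with open(path, "r", encoding="utf-8") as handle:
        tree = ast.parse(handle.read())
    classes, bases, functions, imports = [], [], [], []
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            classes.append(node.name)
            bases.extend(ast.unparse(base) for base in node.bases)
        elif isinstance(node, ast.FunctionDef):
            functions.append(node.name)  # includes methods in this simple pass
        elif isinstance(node, (ast.Import, ast.ImportFrom)):
            imports.append(ast.unparse(node))
    return {"classes": classes, "bases": bases,
            "functions": functions, "imports": imports}

print(summarize_module("tokenization_clip.py"))
```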
The small dataset is what we use for development and for keeping up with the latest repos we are learning from.
Dataset Name | Rows | Size
---|---|---
[Small](https://huggingface.co/datasets/matlok/python-copilot-training-on-ai-research-repos) | 514k | **674 MB**
[Large](https://huggingface.co/datasets/matlok/python-copilot-training-from-many-repos-large) | 2.35m | **3.1 GB**
## Text - Instruct Python Copilot Knowledge Graph Alpaca Datasets
With the source code dataset we built the code instruct dataset. Each row contains a training question and answer in alpaca format, serialized as yaml. A loading sketch follows the table.
Dataset Name | Rows | Size (GB)
---|---|---
[2024-02-03 - AI Python Coding Instructions](https://huggingface.co/datasets/matlok/python-text-copilot-training-instruct-ai-research-2024-02-03) | 1.18m | **2.1**
[2024-01-27 - AI Python Coding Instructions](https://huggingface.co/datasets/matlok/python-text-copilot-training-instruct-ai-research-2024-01-27) | 1.05m | **1.9**
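
As a rough consumption sketch (the **desc** column holding the alpaca yaml is documented in the schema section below; this assumes the yaml in **desc** loads cleanly with `safe_load`):

```python
import yaml
from datasets import load_dataset

# Stream one row rather than downloading the full 2.1 GB split.
ds = load_dataset(
    "matlok/python-text-copilot-training-instruct-ai-research-2024-02-03",
    split="train",
    streaming=True,
)
row = next(iter(ds))

# The alpaca question/answer pair is serialized as yaml in the desc column.
qa = yaml.safe_load(row["desc"])
print(str(qa)[:200])
```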
## Image - Instruct Python Copilot Knowledge Graph Alpaca Datasets
Each row in the image parquet dataset corresponds to a directed knowledge graph saved as a png file. The png, stored in the **dbytes** column, includes a descriptive text box, written in alpaca format, that explains the image using its identifiers. The size of the png file is given by the **dbytes_len** column. Use the **file_path** column to trace a png back to the original source code repository file. A decoding sketch follows the table.
Dataset Name | Rows | Size (GB)
---|---|---
[How to use class methods](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-class-knowledge-graphs-2024-01-27) | 312k | **294**
[How to set up class inheritance](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-inheritance-knowledge-graphs) | 260k | **135**
[How to use global functions](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-function-knowledge-graphs) | 134k | **130**
[How to import modules](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-import-knowledge-graphs) | 217k | **211**
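
A minimal sketch of pulling one png out of a downloaded parquet shard, using the documented **dbytes**, **dbytes_len**, and **file_path** columns (the shard filename is a placeholder):

```python
import io

import pandas as pd
from PIL import Image

df = pd.read_parquet("./files/lok-FILENAME")  # FILENAME is a placeholder

row = df.iloc[0]
image = Image.open(io.BytesIO(row["dbytes"]))  # the png bytes live in dbytes
image.save("knowledge_graph.png")
print(row["file_path"], row["dbytes_len"], "bytes")
```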
## Audio - Instruct Python Copilot Knowledge Graph Alpaca Datasets
Each row in the audio parquet dataset contains one narrated alpaca question or answer, stored as an MP3 file in the **dbytes** column, with its size specified in the **dbytes_len** column. Use the **file_path** column to trace an mp3 back to the original source code repository file. An extraction sketch follows the table.
Dataset Name | Duration | Rows | Size (GB)
---|---|---|---
[How to use class methods](https://huggingface.co/datasets/matlok/python-audio-copilot-training-using-class-knowledge-graphs-2024-01-27) | ~490 days | 135k | **285**
[How to set up class inheritance](https://huggingface.co/datasets/matlok/python-audio-copilot-training-using-inheritance-knowledge-graphs) | ~59 days | 97k | **35**
[How to use global functions](https://huggingface.co/datasets/matlok/python-audio-copilot-training-using-function-knowledge-graphs) | ~218 days | 50k | **126**
[How to import modules](https://huggingface.co/datasets/matlok/python-audio-copilot-training-using-import-knowledge-graphs) | ~104 days | 52k | **60**
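
The same pattern recovers the mp3s from the audio shards (shard filename again a placeholder):

```python
import pandas as pd

df = pd.read_parquet("./files/lok-FILENAME")  # FILENAME is a placeholder
row = df.iloc[0]

# dbytes holds the narrated mp3; dbytes_len reports its size in bytes.
with open("sample.mp3", "wb") as handle:
    handle.write(row["dbytes"])
print(row["file_path"], row["dbytes_len"], "bytes written")
```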
## What is in these datasets?
### Image Training Examples
These graphs are focused on a high-level overview of how to use python:
- classes
- base classes
- global functions
- imports
Each graph includes labeled objects, directionality, standardized colors, and a descriptive text box for all drawn objects.
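
For intuition only, here is a toy sketch of the networkx-to-png rendering style described above; it is not the project's actual drawing code, and the labels are made up:

```python
import matplotlib.pyplot as plt
import networkx as nx

# Toy directed graph: one class node pointing at two of its methods.
graph = nx.DiGraph()
graph.add_edges_from([
    ("CLIPTokenizer", "tokenize"),
    ("CLIPTokenizer", "save_vocabulary"),
])

pos = nx.spring_layout(graph, seed=7)
nx.draw_networkx(graph, pos, node_color="#87ceeb", arrows=True)
plt.figtext(0.5, 0.02, "How to use the CLIPTokenizer class", ha="center")
plt.axis("off")
plt.savefig("toy_class_graph.png", dpi=150)
```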
Below are some extracted image samples:
#### Class - Knowledge Graph Images
Here are samples from the [python copilot class image knowledge graph dataset (294.1 GB)](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-class-knowledge-graphs-2024-01-27). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:
##### How to use the transformers/src/transformers/models/clip/configuration_clip.py CLIPConfig class
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.class.configuration_clip.CLIPConfig.png" alt="How to use the transformers/src/transformers/models/clip/configuration_clip.py CLIPConfig class" width="500" style="display: block; margin: auto;"/>
##### How to use the transformers/src/transformers/models/clip/configuration_clip.py CLIPOnnxConfig class
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.class.configuration_clip.CLIPOnnxConfig.png" alt="How to use the transformers/src/transformers/models/clip/configuration_clip.py CLIPOnnxConfig class" width="500" style="display: block; margin: auto;"/>
##### How to use the transformers/src/transformers/models/clip/tokenization_clip.py CLIPTokenizer class
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.class.tokenization_clip.CLIPTokenizer.png" alt="How to use the transformers/src/transformers/models/clip/tokenization_clip.py CLIPTokenizer class" width="500" style="display: block; margin: auto;"/>
#### Base Class - Inheritance and Polymorphism Knowledge Graph Images
Here are samples from the [python copilot base class inheritance and polymorphism image knowledge graph dataset (135 GB)](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-inheritance-knowledge-graphs). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:
##### How to use the transformers/src/transformers/models/clip/configuration_clip.py CLIPConfig inherited base class(es)
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.base.configuration_clip.CLIPConfig.png" alt="How to use the transformers/src/transformers/models/clip/configuration_clip.py CLIPConfig inherited base class" width="500" style="display: block; margin: auto;"/>
##### How to use the transformers/src/transformers/models/clip/tokenization_clip_fast.py CLIPTokenizerFast inherited base class(es)
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.base.tokenization_clip_fast.CLIPTokenizerFast.png" alt="How to use the transformers/src/transformers/models/clip/tokenization_clip_fast.py CLIPTokenizerFast inherited base class" width="500" style="display: block; margin: auto;"/>
#### Global Functions - Knowledge Graph Images
Here are samples from the [python copilot global functions image knowledge graph dataset (130 GB)](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-functions-knowledge-graphs). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:
##### How to use the transformers/src/transformers/models/clip/convert_clip_original_pytorch_to_hf.py global functions
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.func.convert_clip_original_pytorch_to_hf.png" alt="How to use the transformers/src/transformers/models/clip/convert_clip_original_pytorch_to_hf.py global functions" width="500" style="display: block; margin: auto;"/>
##### How to use the transformers/src/transformers/models/clip/tokenization_clip.py global functions
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.func.tokenization_clip.png" alt="How to use the transformers/src/transformers/models/clip/tokenization_clip.py global functions" width="500" style="display: block; margin: auto;"/>
#### Imports - Knowledge Graph Images
Here are samples from the [python copilot imports image knowledge graph dataset (211 GB)](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-import-knowledge-graphs). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:
##### How to use the transformers/src/transformers/models/clip/configuration_clip.py imports like the CLIPConfig class
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.import.configuration_clip.CLIPConfig.png" alt="How to use the transformers/src/transformers/models/clip/configuration_clip.py imports like the CLIPConfig class" width="500" style="display: block; margin: auto;"/>
##### How to use the transformers/src/transformers/models/clip/configuration_clip.py imports like the CLIPTextConfig class
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.import.configuration_clip.CLIPTextConfig.png" alt="How to use the transformers/src/transformers/models/clip/configuration_clip.py imports like the CLIPTextConfig class" width="500" style="display: block; margin: auto;"/>
##### How to use the transformers/src/transformers/models/clip/configuration_clip.py imports like the CLIPVisionConfig class
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.import.configuration_clip.CLIPVisionConfig.png" alt="How to use the transformers/src/transformers/models/clip/configuration_clip.py imports like the CLIPVisionConfig class" width="500" style="display: block; margin: auto;"/>
##### How to use the transformers/src/transformers/models/clip/tokenization_clip_fast.py imports like the CLIPTokenizerFast class
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/png/transformers/src/transformers/models/clip/image.import.tokenization_clip_fast.CLIPTokenizerFast.png" alt="How to use the transformers/src/transformers/models/clip/tokenization_clip_fast.py imports like the CLIPTokenizerFast class" width="500" style="display: block; margin: auto;"/>
### Audio Training Examples - Question and Answering in Alpaca
Below are extracted question and answer mp3 samples. Each mp3 is a recording of either the alpaca question or the answer; question mp3s use a different speaker voice than answer mp3s.
Note: mobile browsers have trouble playing the mp3s, so markdown may show a confusing **?** icon instead of the **Listen** link. Sorry!
Question | Answer
--- | ---
Play question run_clip.mp3 =>  | Play answer run_clip.mp3 => 
Play question run_clip.Transform.mp3 =>  | Play answer run_clip.Transform.mp3 => 
Play question run_generation_contrastive_search.mp3 =>  | Play answer run_generation_contrastive_search.mp3 => 
Play question run_generation.mp3 =>  | Play answer run_generation.mp3 => 
Play question checkpointing.mp3 =>  | Play answer checkpointing.mp3 => 
Play question fully_sharded_data_parallel.mp3 =>  | Play answer fully_sharded_data_parallel.mp3 => 
Play question fully_sharded_data_parallel.FullyShardedDataParallel.mp3 =>  | Play answer fully_sharded_data_parallel.FullyShardedDataParallel.mp3 => 
Play question convert-hf-to-gguf.QwenModel.mp3 =>  | Play answer convert-hf-to-gguf.QwenModel.mp3 => 
Play question engine.DeepSpeedEngine.mp3 =>  | Play answer engine.DeepSpeedEngine.mp3 => 
Play question flash_mixtral_modeling.MixtralModel.mp3 =>  | Play answer flash_mixtral_modeling.MixtralModel.mp3 => 
Play question flash_mixtral_modeling.MixtralLayer.mp3 =>  | Play answer flash_mixtral_modeling.MixtralLayer.mp3 => 
Play question flash_mixtral_modeling.MixtralAttention.mp3 =>  | Play answer flash_mixtral_modeling.MixtralAttention.mp3 => 
Play question flash_mixtral_modeling.BlockSparseMoE.mp3 =>  | Play answer flash_mixtral_modeling.BlockSparseMoE.mp3 => 
Play question flash_mixtral_modeling.MixtralModel.mp3 =>  | Play answer flash_mixtral_modeling.MixtralModel.mp3 => 
Play question flash_llama_modeling.FlashLlamaAttention.mp3 =>  | Play answer flash_llama_modeling.FlashLlamaAttention.mp3 => 
Play question flash_llama_modeling.FlashLlamaLayer.mp3 =>  | Play answer flash_llama_modeling.FlashLlamaLayer.mp3 => 
## Schema High Level Design
### Summary
We tried to build the schema to help others maximize available hardware (cpu/storage).
### Background
We use a lot of python multiprocessing pools to concurrently search many parquet files in each of our datasets at once. To help others do the same, we included the **recsize** and **desc_len** columns that provide an estimate for "how long" each row will take to: draw as an image or record as an mp3. With these columns, we are able to maximize our hardware because each worker in the python pool is hacking on a task that is "about the same level of effort" as all the other workers at any given time. With these estimated length columns, we can start using these datasets faster than if we were using a single python process.
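
Here is a hedged sketch of that pattern: order rows by the documented **recsize** estimate so pool workers receive similarly sized tasks. The `render_row` function is a stand-in for real work, and the shard filename is a placeholder:

```python
from multiprocessing import Pool

import pandas as pd

def render_row(record: dict) -> str:
    """Stand-in for the real draw-an-image or record-an-mp3 work."""
    return record["file_path"]

if __name__ == "__main__":
    df = pd.read_parquet("./files/lok-FILENAME")  # FILENAME is a placeholder
    # Sort by the estimated effort so neighboring tasks cost about the same,
    # then hand them to the pool in similarly sized chunks.
    records = df.sort_values("recsize").to_dict("records")
    with Pool(processes=8) as pool:
        for finished in pool.imap_unordered(render_row, records, chunksize=16):
            print("finished", finished)
```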
### Overview
To find the alpaca training text data, please refer to the **desc** column.
Here's a breakdown of some of the more useful columns:
- **file_path** - identifier for all datasets source code module path
- **desc** - alpaca question and answer yaml
- **desc_len** - length of the alpaca question and answer yaml
- **recsize** - estimated compute time/size of the row for downstream pools
- **name** - name of the file
- **class_name** - name of the class and global if function
- **class_bases** - comma delimited base class name(s)
- **is_member** - bool for is a class member or global function
- **class_docstr** - class docstring
- **class_docstr_tok** - tokenized class docstring
- **docstr** - docstring for the method or function
- **docstr_tok** - tokenized method or function docstring
- **code_tok** - tokenized code
- **lstart** - start line number
- **lend** - end line number
- **code** - ``" __LOKCDR__ "`` delimited code ordered by class method or global function
- **args** - ``" __LOKCDR__ "`` delimited arguments ordered by class method or global function
- **returns** - ``" __LOKCDR__ "`` delimited returns ordered by class method or global function
- **raises** - ``" __LOKCDR__ "`` delimited exception raises ordered by class method or global function
- **method_names** - ``" __LOKCDR__ "`` delimited code ordered by class method name
- **function_names** - ``" __LOKCDR__ "`` delimited code ordered by global function name
- **imports** - ordered imports in the module
- **filename** - name of the file without the directory pathing
- **total_objects** - total objects detected in the file_path
- **num_classes** - number of classes detected in the file_path
- **num_methods** - number of class methods detected in the file_path
- **num_bases** - number of base classes for the class_name definition
- **num_all_bases** - number of all base classes detected in the file_path
- **num_functions** - number of global functions detected in the file_path
- **num_imports** - number of imports detected in the file_path
- **label_id** - shortened tracking label for image knowledge graphs and mp3s
### Reference Schema - All Data Types
Not all columns are cast to the correct types; here is the reference schema when joining all datasets together:
```
{
"active": "bool",
"args": "string",
"args_len": "int64",
"audio_file": "string",
"audio_path": "string",
"class_bases": "string",
"class_name": "string",
"code": "string",
"code_len": "int64",
"desc": "string",
"desc_docstr": "string",
"desc_docstr_len": "int64",
"desc_len": "int64",
"docstr": "string",
"docstr_len": "int64",
"file_path": "string",
"file_type": "string",
"function_names": "string",
"gen_bytes": "int64",
"gen_data_type": "string",
"gen_mode": "string",
"gen_size": "int64",
"gen_valid": "int64",
"height": "int64",
"image_file": "string",
"image_path": "string",
"name": "string",
"num_all_bases": "int64",
"num_bases": "int64",
"num_classes": "int64",
"num_functions": "int64",
"num_imports": "int64",
"num_methods": "int64",
"prompts": "string",
"raises": "string",
"raises_len": "float64",
"recsize": "int64",
"repo": "string",
"returns": "string",
"returns_len": "float64",
"size": "int64",
"src_object": "string",
"total_objects": "int64",
"usage": "string",
"usages": "string",
"width": "int64"
}
```
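
Because shards may not arrive with these dtypes, here is a small sketch of coercing a loaded frame toward the reference schema (only a handful of columns shown; some casts, like `bool` columns with missing values, may need extra care):

```python
import pandas as pd

REFERENCE_DTYPES = {  # a small subset of the reference schema above
    "active": "bool",
    "args_len": "int64",
    "raises_len": "float64",
    "desc": "string",
}

df = pd.read_parquet("./files/lok-FILENAME")  # FILENAME is a placeholder
# Coerce only the columns that are present in this shard.
df = df.astype({col: dtype for col, dtype in REFERENCE_DTYPES.items()
                if col in df.columns})
print(df.dtypes)
```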
#### Deserializing a Class or Function in a Row
Note: there is a custom delimiter: ``" __LOKCDR__ "`` for preserving class method and global function ordering within the same sample row. For example, when viewing the class by method names you can use:
```python
# Split one dataframe row's delimited columns back into aligned lists.
row = df.iloc[0]
class_method_names = row["method_names"].split(" __LOKCDR__ ")
code_per_method = row["code"].split(" __LOKCDR__ ")
args_per_method = row["args"].split(" __LOKCDR__ ")
raises_per_method = row["raises"].split(" __LOKCDR__ ")
returns_per_method = row["returns"].split(" __LOKCDR__ ")
```
The returned lists in the example above are ordered by class member method name with associated code, method arguments, raised exceptions and return statements.
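
Pairing the parallel lists back together is then just a `zip`:

```python
# Walk methods with their code and arguments, aligned strictly by position.
for name, method_code, method_args in zip(
        class_method_names, code_per_method, args_per_method):
    print(f"{name}({method_args})")
    print(method_code[:80])
```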
## Legacy Datasets
### Legacy Coding Instruction datasets
This dataset was created around 2024-01-10 from 1159 python source code repositories (~142 GB on disk). While the datasets are still available, they are no longer supported due to a duplication issue in the class rows. Note: the rows for global functions, base class inheritance/polymorphism, and module imports were not impacted by this issue.
Here's how to extract the sub datasets within any of the coding instruction parquet files:
```python
import pandas as pd

# Load one legacy coding instruction parquet shard (FILENAME is a placeholder).
df = pd.read_parquet("./files/lok-FILENAME")

# Split out the sub datasets by source object type.
functions_df = df[(df["src_object"] == "func")]
bases_df = df[(df["src_object"] == "base")]
imports_df = df[(df["src_object"] == "import")]
```
Dataset Name | Rows | Size (GB)
---|---|---
[Instruct v2 - Building an AI copilot to leverage AI research](https://huggingface.co/datasets/matlok/python-text-copilot-training-instruct-ai-research) | 2.32m | **27.6**
[Instruct v1 - Prototype](https://huggingface.co/datasets/matlok/python-text-copilot-training-instruct) | 1.74m | **28.4**
### Legacy Image datasets
Dataset Name | Rows | Size (GB)
---|---|---
[How to use class methods](https://huggingface.co/datasets/matlok/python-image-copilot-training-using-class-knowledge-graphs) | 312k | **304**
### Legacy Audio datasets
Dataset Name | Duration | Rows | Size (GB)
---|---|---|---
[How to use class methods](https://huggingface.co/datasets/matlok/python-audio-copilot-training-using-class-knowledge-graphs) | ~331 days | 211k | **191**
### What was the process for collecting the source code datasets?
The coding datasets are built for extracting the latest updates from many source code repositories. It takes about seven hours to regenerate the large code dataset. Most of the image datasets were generated from the large index dataset, and the audio datasets were mostly generated from the small coding dataset.
### Source Code Background
What source code repositories are in here?
- Python repositories: 1207
- Source repos size on disk: 144.5 GB
- Rows: 2350782
- Python classes: 176237
### Did AI or ML create any of this data?
No. The focus of this dataset was to build and share a large, clean training dataset without using AI or ML models. These datasets were collected without pre-trained AI/ML models performing: summarization, classification, segmentation, vocalization, image rendering, coding, or testing.
## License
These are early-days educational datasets. We do not claim ownership or validity of the code contained in them. The instruct text, image, and audio datasets are creative derivative works, but we are not lawyers, so use them at your own discretion. By using these datasets, you acknowledge these risks and take ownership after the files are downloaded.
## Thanks for reading, listening and your time
<img src="https://raw.githubusercontent.com/matlok-ai/python-copilot-image-and-audio-examples/main/static/lok-1-python.jpg" alt="Thanks for reading, listening and your time" width="500" style="display: block; margin: auto;"/>
"### Source Code Background\n\n\nWhat source code repositories are in here?\n\n\n* Python repositories: 1207\n* Source repos size on disk: 144.5 GB\n* Rows: 2350782\n* Python classes: 176237",
"### Did AI or ML create any of this data?\n\n\nNo. The focus on this dataset was to build and share a large, clean training dataset without using AI or ML models. These datasets were collected without pre-trained AI/ML models performing: summarization, classification, segmentation, vocalization, image rendering, coding, or testing.\n\n\nLicense\n-------\n\n\nThese are early-days educational datasets. We do not claim ownership or validity of the code in the datasets in here. The instruction: text instruction, image and audio datasets are creative derivative works, but we are not lawyers so use them at your own discretion. By using these datasets, you acknowledge these risks and take ownership after the files are downloaded.\n\n\nThanks for reading, listening and your time\n-------------------------------------------\n\n\n<img src=\"URL alt=\"Thanks for reading, listening and your time\" width=\"500\" style=\"display: block; margin: auto;\"/>"
] | [
"TAGS\n#task_categories-text-generation #task_categories-text-to-audio #task_categories-text-to-speech #task_categories-text-to-image #task_categories-audio-to-audio #task_categories-image-to-image #task_categories-question-answering #task_ids-parsing #size_categories-10K<n<100K #size_categories-100K<n<1M #size_categories-1M<n<10M #license-other #python-copilot #python-coding #python-architecture #knowledge-graphs #multimodal #text-image-audio #fine-tuning #training #question-answering #image-knowledge-graph #alpaca #mp3 #png #text #instruct #class #classes #function #functions #inheritance #region-us \n",
"### Image Training Examples\n\n\nThese graphs are focused on a high-level overview of how to use python:\n\n\n* classes\n* base classes\n* global functions\n* imports\n\n\nEach graph includes labeled objects, directionality, standardized colors, and a descriptive text box for all drawn objects.\n\n\nBelow are some extracted image samples:",
"#### Class - Knowledge Graph Images\n\n\nHere are samples from the python copilot class image knowledge graph dataset (294.1 GB). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:",
"##### How to use the transformers/src/transformers/models/clip/configuration\\_clip.py CLIPConfig class\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/configuration\\_clip.py CLIPConfig class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"##### How to use the transformers/src/transformers/models/clip/configuration\\_clip.py CLIPOnnxConfig class\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/configuration\\_clip.py CLIPOnnxConfig class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"##### How to use the transformers/src/transformers/models/clip/tokenization\\_clip.py CLIPTokenizer class\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/tokenization\\_clip.py CLIPTokenizer class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"#### Base Class - Inheritance and Polymorphism Knowledge Graph Images\n\n\nHere are samples from the python copilot base class inheritance and polymorphism image knowledge graph dataset (135 GB). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:",
"##### How to use the transformers/src/transformers/models/clip/configuration\\_clip.py CLIPConfig inherited base class(es)\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/configuration\\_clip.py CLIPConfig inherited base class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"##### How to use the transformers/src/transformers/models/clip/tokenization\\_clip\\_fast.py CLIPTokenizerFast inherited base class(es)\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/tokenization\\_clip\\_fast.py CLIPTokenizerFast inherited base class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"#### Global Functions - Knowledge Graph Images\n\n\nHere are samples from the python copilot global functions image knowledge graph dataset (130 GB). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:",
"##### How to use the transformers/src/transformers/models/clip/convert\\_clip\\_original\\_pytorch\\_to\\_hf.py global functions\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/convert\\_clip\\_original\\_pytorch\\_to\\_hf.py global functions\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"##### How to use the transformers/src/transformers/models/clip/tokenization\\_clip.py global functions\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/tokenization\\_clip.py global functions\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"#### Imports - Knowledge Graph Images\n\n\nHere are samples from the python copilot imports image knowledge graph dataset (211 GB). These images attempt to teach how to use software with a networkx graph saved as a png with an alpaca text box:",
"##### How to use the transformers/src/transformers/models/clip/configuration\\_clip.py imports like the CLIPConfig class\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/configuration\\_clip.py imports like the CLIPConfig class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"##### How to use the transformers/src/transformers/models/clip/configuration\\_clip.py imports like the CLIPTextConfig class\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/configuration\\_clip.py imports like the CLIPTextConfig class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"##### How to use the transformers/src/transformers/models/clip/configuration\\_clip.py imports like the CLIPVisionConfig class\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/configuration\\_clip.py imports like the CLIPVisionConfig class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"##### How to use the transformers/src/transformers/models/clip/tokenization\\_clip\\_fast.py imports like the CLIPTokenizerFast class\n\n\n<img src=\"URL alt=\"How to use the transformers/src/transformers/models/clip/tokenization\\_clip\\_fast.py imports like the CLIPTokenizerFast class\" width=\"500\" style=\"display: block; margin: auto;\"/>",
"### Audio Training Examples - Question and Answering in Alpaca\n\n\nBelow are extracted question and answer mp3 samples. Each mp3 is either a recording of the alpaca question or answer. Question mp3s use a different speaker than the answer mp3 voice.\n\n\nNote: mobile browsers have issues playing the mp3s and show a question mark due to markdown failing to show the Listen link vs a confusing ? mark icon sorry!\n\n\n\nSchema High Level Design\n------------------------",
"### Summary\n\n\nWe tried to build the schema to help others maximize available hardware (cpu/storage).",
"### Background\n\n\nWe use a lot of python multiprocessing pools to concurrently search many parquet files in each of our datasets at once. To help others do the same, we included the recsize and desc\\_len columns that provide an estimate for \"how long\" each row will take to: draw as an image or record as an mp3. With these columns, we are able to maximize our hardware because each worker in the python pool is hacking on a task that is \"about the same level of effort\" as all the other workers at any given time. With these estimated length columns, we can start using these datasets faster than if we were using a single python process.",
"### Overview\n\n\nTo find the alpaca training text data, please refer to the desc column.\n\n\nHere's a breakdown of some of the more useful columns:\n\n\n* file\\_path - identifier for all datasets source code module path\n* desc - alpaca question and answer yaml\n* desc\\_len - length of the alpaca question and answer yaml\n* recsize - estimated compute time/size of the row for downstream pools\n* name - name of the file\n* class\\_name - name of the class and global if function\n* class\\_bases - comma delimited base class name(s)\n* is\\_member - bool for is a class member or global function\n* class\\_docstr - class docstring\n* class\\_docstr\\_tok - tokenized class docstring\n* docstr - docsting for the method or function\n* docstr\\_tok - tokenized method or function docstring\n* code\\_tok - tokenized code\n* lstart - start line number\n* lend - end line number\n* code - ''\" **LOKCDR** \"'' delimited code ordered by class method or global function\n* args - ''\" **LOKCDR** \"'' delimited arguments ordered by class method or global function\n* returns - ''\" **LOKCDR** \"'' delimited returns ordered by class method or global function\n* raises - ''\" **LOKCDR** \"'' delimited exception raises ordered by class method or global function\n* method\\_names - ''\" **LOKCDR** \"'' delimited code ordered by class method name\n* function\\_names - ''\" **LOKCDR** \"'' delimited code ordered by global function name\n* imports - ordered imports in the module\n* filename - name of the file without the directory pathing\n* total\\_objects - total objects detected in the file\\_path\n* num\\_classes - number of classes detected in the file\\_path\n* num\\_methods - number of class methods detected in the file\\_path\n* num\\_bases - number of base classes for the class\\_name definition\n* num\\_all\\_bases - number of all base classes detected in the file\\_path\n* num\\_functions - number of global functions detected in the file\\_path\n* num\\_imports - number of imports detected in the file\\_path\n* label\\_id - shortened tracking label for image knowledge graphs and mp3s",
"### Reference Schema - All Data Types\n\n\nNot all columns are casted to the correct types, here is the reference schema when joining all datasets together:",
"#### Deserializing a Class or Function in a Row\n\n\nNote: there is a custom delimiter: ''\" **LOKCDR** \"'' for preserving class method and global function ordering within the same sample row. For example, when viewing the class by method names you can use:\n\n\nThe returned lists in the example above are ordered by class member method name with associated code, method arguments, raised exceptions and return statements.\n\n\nLegacy Datasets\n---------------",
"### Legacy Coding Instruction datasets\n\n\nThis dataset was created around 2024-01-10 from 1159 python source code repositories (~142 GB on disk). While the datasets are still available they are no longer supported due to an issue with duplication in the class rows. Note: the rows for global functions, base class inheritance/polymorphism, and module imports were not impacted by this issue.\n\n\nHere's how to extract the sub datasets within any of the coding instruction parquet files:\n\n\nDataset Name: Instruct v2 - Building an AI copilot to leverage AI research, Rows: 2.32m, Size (GB): 27.6\nDataset Name: Instruct v1 - Prototype, Rows: 1.74m, Size (GB): 28.4",
"### Legacy Image datasets\n\n\nDataset Name: How to use class methods, Rows: 312k, Size (GB): 304",
"### Legacy Audio datasets",
"### What was the process for collecting the source code datasets?\n\n\nThe coding datasets are built for extracting the latest updates from many source code repositories. It takes about seven hours to regenerate the large code dataset. Most of the image datasets were generated from the large index dataset, and the audio datasets were mostly generated from the small coding dataset.",
"### Source Code Background\n\n\nWhat source code repositories are in here?\n\n\n* Python repositories: 1207\n* Source repos size on disk: 144.5 GB\n* Rows: 2350782\n* Python classes: 176237",
"### Did AI or ML create any of this data?\n\n\nNo. The focus on this dataset was to build and share a large, clean training dataset without using AI or ML models. These datasets were collected without pre-trained AI/ML models performing: summarization, classification, segmentation, vocalization, image rendering, coding, or testing.\n\n\nLicense\n-------\n\n\nThese are early-days educational datasets. We do not claim ownership or validity of the code in the datasets in here. The instruction: text instruction, image and audio datasets are creative derivative works, but we are not lawyers so use them at your own discretion. By using these datasets, you acknowledge these risks and take ownership after the files are downloaded.\n\n\nThanks for reading, listening and your time\n-------------------------------------------\n\n\n<img src=\"URL alt=\"Thanks for reading, listening and your time\" width=\"500\" style=\"display: block; margin: auto;\"/>"
] |
614935b2507cad26ed62000f25d95746fb617492 |
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed3",
"harness_winogrande_5",
split="train")
```
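Similarly, the aggregated metrics can be loaded from the "results" configuration mentioned above; a minimal sketch (per the note above, the "train" split points at the latest run):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed3",
    "results",
    split="train",
)
```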
## Latest results
These are the [latest results from run 2024-01-23T17:27:53.153945](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed3/blob/main/results_2024-01-23T17-27-53.153945.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6501939026133468,
"acc_stderr": 0.03218788319610944,
"acc_norm": 0.6495560952324442,
"acc_norm_stderr": 0.032858464838240266,
"mc1": 0.5997552019583844,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.7231441421880092,
"mc2_stderr": 0.014899035237604321
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.742320819112628,
"acc_norm_stderr": 0.012780770562768398
},
"harness|hellaswag|10": {
"acc": 0.7395937064329815,
"acc_stderr": 0.004379594059141035,
"acc_norm": 0.8927504481179048,
"acc_norm_stderr": 0.0030879787141283683
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091826,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091826
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623553,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5997552019583844,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.7231441421880092,
"mc2_stderr": 0.014899035237604321
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873497
},
"harness|gsm8k|5": {
"acc": 0.6747536012130402,
"acc_stderr": 0.012903904752543919
}
}
```
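For example, the MMLU-style average can be recomputed from the block above; a minimal sketch, assuming you saved that JSON object to a local file (the file name is a placeholder):

```python
import json

# Placeholder path: save the results block above to this file first.
with open("results.json") as f:
    results = json.load(f)

# Average the 5-shot accuracy over all hendrycksTest (MMLU) subtasks.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```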
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed3 | [
"region:us"
] | 2024-01-23T17:30:13+00:00 | {"pretty_name": "Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3", "dataset_summary": "Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T17:27:53.153945](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed3/blob/main/results_2024-01-23T17-27-53.153945.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501939026133468,\n \"acc_stderr\": 0.03218788319610944,\n \"acc_norm\": 0.6495560952324442,\n \"acc_norm_stderr\": 0.032858464838240266,\n \"mc1\": 0.5997552019583844,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.7231441421880092,\n \"mc2_stderr\": 0.014899035237604321\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.742320819112628,\n \"acc_norm_stderr\": 0.012780770562768398\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7395937064329815,\n \"acc_stderr\": 0.004379594059141035,\n \"acc_norm\": 0.8927504481179048,\n \"acc_norm_stderr\": 0.0030879787141283683\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 
0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 
0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 
0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5997552019583844,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.7231441421880092,\n \"mc2_stderr\": 0.014899035237604321\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873497\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.6747536012130402,\n \"acc_stderr\": 0.012903904752543919\n }\n}\n```", "repo_url": "https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-27-53.153945.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-27-53.153945.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-27-53.153945.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-27-53.153945.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-27-53.153945.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["**/details_harness|winogrande|5_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-23T17-27-53.153945.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T17_27_53.153945", "path": ["results_2024-01-23T17-27-53.153945.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T17-27-53.153945.parquet"]}]}]} | 2024-01-23T17:30:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3
Dataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
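The snippet below follows the standard pattern for these evaluation datasets; the repo id is inferred from the usual `details_<org>__<model>` naming convention, and the `harness_winogrande_5` configuration is listed in this dataset's metadata.

```python
from datasets import load_dataset

# Repo id inferred from the standard naming convention used by
# open-llm-leaderboard evaluation detail datasets.
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-dpo-ed3",
                    "harness_winogrande_5",
                    split="train")
```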
## Latest results
These are the latest results from run 2024-01-23T17:27:53.153945 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:27:53.153945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3\n\n\n\nDataset automatically created during the evaluation run of model YouKnowMee/Mistral-7b-instruct-v0.2-summ-dpo-ed3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:27:53.153945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b077b0cc355e1f9ddc59757bf20bad6753dc5e15 | # About
This dataset consists of 560 sprite animations, each in the form of a single image paired with a meaningful description, on a consistent gray background.
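For a quick look at the data, a minimal loading sketch with the `datasets` library is shown below; the exact column names are not documented here, so the sketch simply inspects the features of the first record.

```python
from datasets import load_dataset

# Minimal sketch: the column layout (image/description fields) is not
# documented, so load the train split and inspect it directly.
ds = load_dataset("pawkanarek/spraix_1024", split="train")
print(ds.features)
print(ds[0])
```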
# Credits
Special thanks to the skilled sprite animation creators whose work contributed to the training dataset for this project.
- Train images [0.png](images/0.png) - [6.png](images/6.png) thanks to https://oisougabo.itch.io/gap-i
- Train images [7.png](images/7.png) - [21.png](images/21.png) thanks to https://szadiart.itch.io/2d-soulslike-character
- Train images [22.png](images/22.png) - [29.png](images/29.png) thanks to https://admurin.itch.io/mega-admurins-freebies
- Train images [30.png](images/30.png) - [37.png](images/37.png) thanks to https://astrobob.itch.io/arcane-archer
- Train images [38.png](images/38.png) - [43.png](images/43.png) thanks to https://penusbmic.itch.io/sci-fi-character-pack-10
- Train images [44.png](images/44.png) - [44.png](images/44.png) thanks to https://creativekind.itch.io/gif-bloodmoon-tower-free
- Train images [45.png](images/45.png) - [51.png](images/51.png) thanks to https://clembod.itch.io/bringer-of-death-free
- Train images [52.png](images/52.png) - [71.png](images/71.png) thanks to https://admurin.itch.io/mega-admurins-freebies
- Train images [72.png](images/72.png) - [97.png](images/97.png) thanks to https://assetbakery.itch.io/2d-fighter-3
- Train images [98.png](images/98.png) - [102.png](images/102.png) thanks to https://ansimuz.itch.io/dancing-girl-sprites
- Train images [103.png](images/103.png) - [126.png](images/126.png) thanks to https://chierit.itch.io/elementals-leaf-ranger
- Train images [127.png](images/127.png) - [141.png](images/141.png) thanks to https://chierit.itch.io/elementals-fire-knight
- Train images [142.png](images/142.png) - [157.png](images/157.png) thanks to https://chierit.itch.io/elementals-water-priestess
- Train images [158.png](images/158.png) - [162.png](images/162.png) thanks to https://luizmelo.itch.io/evil-wizard
- Train images [163.png](images/163.png) - [167.png](images/167.png) thanks to https://penusbmic.itch.io/monster-pack-i
- Train images [168.png](images/168.png) - [169.png](images/169.png) thanks to https://foozlecc.itch.io/void-environment-pack
- Train images [170.png](images/170.png) - [175.png](images/175.png) thanks to https://xyezawr.itch.io/gif-free-pixel-effects-pack-6-forks-of-flame
- Train images [176.png](images/176.png) - [183.png](images/183.png) thanks to https://luizmelo.itch.io/hero-knight-2
- Train images [184.png](images/184.png) - [191.png](images/191.png) thanks to https://luizmelo.itch.io/hero-knight
- Train images [192.png](images/192.png) - [198.png](images/198.png) thanks to https://luizmelo.itch.io/huntress-2
- Train images [199.png](images/199.png) - [208.png](images/208.png) thanks to https://luizmelo.itch.io/huntress
- Train images [209.png](images/209.png) - [216.png](images/216.png) thanks to https://luizmelo.itch.io/martial-hero-2
- Train images [217.png](images/217.png) - [225.png](images/225.png) thanks to https://luizmelo.itch.io/martial-hero-3
- Train images [226.png](images/226.png) - [233.png](images/233.png) thanks to https://luizmelo.itch.io/martial-hero
- Train images [234.png](images/234.png) - [242.png](images/242.png) thanks to https://luizmelo.itch.io/medieval-king-pack-2
- Train images [243.png](images/243.png) - [252.png](images/252.png) thanks to https://luizmelo.itch.io/medieval-warrior-pack-2
- Train images [253.png](images/253.png) - [261.png](images/261.png) thanks to https://luizmelo.itch.io/medieval-warrior-pack-3
- Train images [262.png](images/262.png) - [278.png](images/278.png) thanks to https://admurin.itch.io/pixel-character-horse-rider
- Train images [279.png](images/279.png) - [279.png](images/279.png) thanks to https://mattwalkden.itch.io/free-robot-warfare-pack
- Train images [280.png](images/280.png) - [294.png](images/294.png) thanks to https://szadiart.itch.io/rocky-world-platformer-set
- Train images [295.png](images/295.png) - [298.png](images/298.png) thanks to https://penusbmic.itch.io/characterpack1
- Train images [299.png](images/299.png) - [302.png](images/302.png) thanks to https://penusbmic.itch.io/monster-pack-i
- Train images [303.png](images/303.png) - [311.png](images/311.png) thanks to https://darkpixel-kronovi.itch.io/undead-executioner
- Train images [312.png](images/312.png) - [319.png](images/319.png) thanks to https://luizmelo.itch.io/wizard-pack
- Train images [320.png](images/320.png) - [324.png](images/324.png) thanks to https://chierit.itch.io/boss-demon-slime
- Train images [325.png](images/325.png) - [384.png](images/384.png) thanks to https://scrabling.itch.io/pixel-isometric-tiles
- Train images [385.png](images/385.png) - [389.png](images/389.png) thanks to https://rili-xl.itch.io/cultist-priest-pack
- Train images [390.png](images/390.png) - [405.png](images/405.png) thanks to https://arks.itch.io/dino-characters
- Train images [406.png](images/406.png) - [419.png](images/419.png) thanks to https://chierit.itch.io/elementals-leaf-ranger
- Train images [420.png](images/420.png) - [423.png](images/423.png) thanks to https://opengameart.org/content/lpc-maskman
- Train images [424.png](images/424.png) - [428.png](images/428.png) thanks to https://penusbmic.itch.io/monster-pack-i
- Train images [429.png](images/429.png) - [431.png](images/431.png) thanks to https://bdragon1727.itch.io/free-trap-platformer
- Train images [432.png](images/432.png) - [559.png](images/559.png) thanks to https://github.com/YingzhenLi/Sprites | pawkanarek/spraix_1024 | [
"task_categories:text-classification",
"size_categories:n<1K",
"language:en",
"license:gpl-3.0",
"art",
"region:us"
] | 2024-01-23T17:32:26+00:00 | {"language": ["en"], "license": "gpl-3.0", "size_categories": ["n<1K"], "task_categories": ["text-classification"], "pretty_name": "Spraix base dataset 1024x1024", "tags": ["art"]} | 2024-01-24T10:27:48+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-n<1K #language-English #license-gpl-3.0 #art #region-us
| # About
This dataset consists of 560 sprite animations, each in the form of a single image paired with a meaningful description, on a consistent gray background.
# Credits
Special thanks to the skilled sprite animation creators whose work contributed to the training dataset for this project.
- Train images 0.png - 6.png thanks to URL
- Train images 7.png - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL
- Train images URL - URL thanks to URL | [
"# About \nThis dataset consist 560 Sprite animations in form of single image paired with meaningful description, with consistent gray background.",
"# Credits\nSpecial thanks to the skilled sprite animation creators, contributing to the training dataset for this project.\n\n- Train images 0.png - 6.png thanks to URL \n- Train images 7.png - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL"
] | [
"TAGS\n#task_categories-text-classification #size_categories-n<1K #language-English #license-gpl-3.0 #art #region-us \n",
"# About \nThis dataset consist 560 Sprite animations in form of single image paired with meaningful description, with consistent gray background.",
"# Credits\nSpecial thanks to the skilled sprite animation creators, contributing to the training dataset for this project.\n\n- Train images 0.png - 6.png thanks to URL \n- Train images 7.png - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL \n- Train images URL - URL thanks to URL"
] |
376b2fbaf9c410b6c804437974223d2e0365f474 |
500 images of women, used to obtain a flexible and polished model for different needs.
The base model is a mix of 30% SD 1.5 (8Gb) with 70% epicphotogasm_lastUnicorn, with the structure defined by lastUnicorn.
This provides all the detail of SD, with the strong structures of epicphotogasm:
python merge.py "WS" /tmp v1-5-pruned-emaonly.safetensors epicphotogasm_lastUnicorn.safetensors --cosine1 --alpha=0.70
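As a rough illustration of what this weighted merge does, here is a simplified sketch; it is not the actual merge.py (the `--cosine1` cosine-similarity weighting is omitted, and the output filename is hypothetical).

```python
# Simplified sketch of a 30/70 weighted checkpoint merge.
# Assumptions: both checkpoints are plain safetensors state dicts;
# the --cosine1 weighting of the real script is omitted, and the
# output filename is hypothetical.
from safetensors.torch import load_file, save_file

alpha = 0.70  # weight of epicphotogasm_lastUnicorn

a = load_file("v1-5-pruned-emaonly.safetensors")
b = load_file("epicphotogasm_lastUnicorn.safetensors")

merged = {}
for key, tensor_a in a.items():
    if key in b and b[key].shape == tensor_a.shape:
        merged[key] = (1.0 - alpha) * tensor_a + alpha * b[key]
    else:
        # Keep the SD 1.5 tensor where the two checkpoints don't line up.
        merged[key] = tensor_a

save_file(merged, "exp_sd_base.safetensors")
```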
A first bake of 2000 steps using DreamBooth, to generate extra tags and provide extra flexibility,
regularizing on more varied types of women.
A second bake of 2500 steps using ss-script finetuning, to fine-tune the model to adhere to the images.
## Examples
exp_sd_v2


exp_sd_v4
 | halftimecoder/exp_sd | [
"language:en",
"stable diffusion",
"region:us"
] | 2024-01-23T17:41:52+00:00 | {"language": ["en"], "tags": ["stable diffusion"]} | 2024-01-26T23:25:55+00:00 | [] | [
"en"
] | TAGS
#language-English #stable diffusion #region-us
|
500 images of women, used to obtain a flexible and polished model for different needs.
The base model is a mix of 30% SD 1.5 (8Gb) with 70% epicphotogasm_lastUnicorn, with the structure defined by lastUnicorn.
This provides all the detail of SD, with the strong structures of epicphotogasm:
python URL "WS" /tmp v1-5-pruned-emaonly.safetensors epicphotogasm_lastUnicorn.safetensors --cosine1 --alpha=0.70
A first bake of 2000 steps using DreamBooth, to generate extra tags and provide extra flexibility,
regularizing on more varied types of women.
A second bake of 2500 steps using ss-script finetuning, to fine-tune the model to adhere to the images.
## Examples
exp_sd_v2
!woman in red dress
!Redhead sitting on a chair
exp_sd_v4
!Redhead sitting on a chair | [
"## Examples\n\nexp_sd_v2\n!woman in red dress\n\n!Redhead sitting on a chair\n\nexp_sd_v4\n\n!Redhead sitting on a chair"
] | [
"TAGS\n#language-English #stable diffusion #region-us \n",
"## Examples\n\nexp_sd_v2\n!woman in red dress\n\n!Redhead sitting on a chair\n\nexp_sd_v4\n\n!Redhead sitting on a chair"
] |
37d654fe23e0d3330d952b73ebd74007b5c6e99d |
# Dataset Card for Evaluation run of argilla/DistilabelBeagle14-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [argilla/DistilabelBeagle14-7B](https://huggingface.co/argilla/DistilabelBeagle14-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B",
"harness_winogrande_5",
split="train")
```
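Similarly, the aggregated metrics shown below can be pulled from the "results" configuration; this sketch assumes the standard config/split layout used by these evaluation datasets.

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent one.
results = load_dataset("open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B",
                       "results",
                       split="latest")
```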
## Latest results
These are the [latest results from run 2024-01-23T17:49:14.441859](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B/blob/main/results_2024-01-23T17-49-14.441859.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6145882827180348,
"acc_stderr": 0.03298061836989478,
"acc_norm": 0.6187554842215415,
"acc_norm_stderr": 0.033661889349411354,
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136473,
"mc2": 0.6890798826156785,
"mc2_stderr": 0.015559307572072172
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.013512058415238361,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.7055367456681936,
"acc_stderr": 0.0045486957496209575,
"acc_norm": 0.8700458076080462,
"acc_norm_stderr": 0.003355658238571492
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969115,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010354,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.02667561192603711,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.02667561192603711
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.01265903323706725,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.01265903323706725
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271758,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304324,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304324
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728665,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728665
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136473,
"mc2": 0.6890798826156785,
"mc2_stderr": 0.015559307572072172
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491906
},
"harness|gsm8k|5": {
"acc": 0.36087945413191813,
"acc_stderr": 0.013228626753925141
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B | [
"region:us"
] | 2024-01-23T17:51:39+00:00 | {"pretty_name": "Evaluation run of argilla/DistilabelBeagle14-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [argilla/DistilabelBeagle14-7B](https://huggingface.co/argilla/DistilabelBeagle14-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T17:49:14.441859](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B/blob/main/results_2024-01-23T17-49-14.441859.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6145882827180348,\n \"acc_stderr\": 0.03298061836989478,\n \"acc_norm\": 0.6187554842215415,\n \"acc_norm_stderr\": 0.033661889349411354,\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136473,\n \"mc2\": 0.6890798826156785,\n \"mc2_stderr\": 0.015559307572072172\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238361,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7055367456681936,\n \"acc_stderr\": 0.0045486957496209575,\n \"acc_norm\": 0.8700458076080462,\n \"acc_norm_stderr\": 0.003355658238571492\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969115,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010354,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010354\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7854406130268199,\n \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.02667561192603711,\n \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.02667561192603711\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n \"acc_stderr\": 0.01265903323706725,\n \"acc_norm\": 0.43415906127770537,\n \"acc_norm_stderr\": 0.01265903323706725\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271758,\n \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271758\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304324,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304324\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n \"acc_stderr\": 0.028748298931728665,\n \"acc_norm\": 0.7910447761194029,\n \"acc_norm_stderr\": 0.028748298931728665\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.032467217651178264,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.032467217651178264\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136473,\n \"mc2\": 0.6890798826156785,\n \"mc2_stderr\": 0.015559307572072172\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491906\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36087945413191813,\n \"acc_stderr\": 
0.013228626753925141\n }\n}\n```", "repo_url": "https://huggingface.co/argilla/DistilabelBeagle14-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-49-14.441859.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-49-14.441859.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-49-14.441859.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T17-49-14.441859.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-49-14.441859.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T17_49_14.441859", "path": ["**/details_harness|winogrande|5_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T17-49-14.441859.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T17_49_14.441859", "path": ["results_2024-01-23T17-49-14.441859.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T17-49-14.441859.parquet"]}]}]} | 2024-01-23T17:52:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of argilla/DistilabelBeagle14-7B
Dataset automatically created during the evaluation run of model argilla/DistilabelBeagle14-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
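```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B",
	"harness_winogrande_5",
	split="train")
```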
## Latest results
These are the latest results from run 2024-01-23T17:49:14.441859 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of argilla/DistilabelBeagle14-7B\n\n\n\nDataset automatically created during the evaluation run of model argilla/DistilabelBeagle14-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:49:14.441859(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of argilla/DistilabelBeagle14-7B\n\n\n\nDataset automatically created during the evaluation run of model argilla/DistilabelBeagle14-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T17:49:14.441859(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9a9e21e4fecd17804166ec06a514cf170a8080d9 |
This dataset was created by translating part of [en-fr-translation-dataset](https://www.kaggle.com/datasets/dhruvildave/en-fr-translation-dataset) using [Argos Translate](https://github.com/argosopentech/argos-translate); a minimal translation sketch follows this record. | klima7/en-pl-translation | [
"task_categories:translation",
"language:pl",
"language:en",
"license:odbl",
"region:us"
] | 2024-01-23T18:03:04+00:00 | {"language": ["pl", "en"], "license": "odbl", "task_categories": ["translation"]} | 2024-01-24T19:26:56+00:00 | [] | [
"pl",
"en"
] | TAGS
#task_categories-translation #language-Polish #language-English #license-odbl #region-us
|
This dataset was created by translating part of en-fr-translation-dataset using Argos Translate. | [] | [
"TAGS\n#task_categories-translation #language-Polish #language-English #license-odbl #region-us \n"
] |
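A minimal sketch of how the English-to-Polish translations in the klima7/en-pl-translation record above could be reproduced with Argos Translate; the package-selection code and the sample sentence are illustrative assumptions, not the dataset author's actual pipeline:

```python
import argostranslate.package
import argostranslate.translate

# Fetch the package index and install the English -> Polish model (requires network access).
argostranslate.package.update_package_index()
available = argostranslate.package.get_available_packages()
en_pl = next(p for p in available if p.from_code == "en" and p.to_code == "pl")
argostranslate.package.install_from_path(en_pl.download())

# Translate one English source sentence into Polish.
print(argostranslate.translate.translate("The weather is nice today.", "en", "pl"))
```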
31469b8f77531c89a6893775913e9bfc189712e1 | # Dataset Card for "uf_unsafe_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | yimingzhang/uf_unsafe_v3 | [
"region:us"
] | 2024-01-23T18:23:00+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}], "dataset_info": {"features": [{"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_prefs", "num_bytes": 57674, "num_examples": 122}, {"name": "test_prefs", "num_bytes": 82728, "num_examples": 172}], "download_size": 75399, "dataset_size": 140402}} | 2024-01-23T18:23:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "uf_unsafe_v3"
More Information needed | [
"# Dataset Card for \"uf_unsafe_v3\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"uf_unsafe_v3\"\n\nMore Information needed"
] |
24a4b4d38e9e60a882b2cf3b007e72ec24dc4f60 | # Dataset Card for Cellular Automata
## Dataset Details
This dataset contains 1000 labeled images of cellular automata.
### Dataset Description
[Cellular automata](https://mathworld.wolfram.com/ElementaryCellularAutomaton.html) were described by
Stephen Wolfram in [A New Kind of Science](https://www.wolframscience.com/nks/).
Imagine you have a grid, like a checkerboard. Each square in the grid has a state - on or off, with the state of the square determining its color. There are rulesets (256 of them) that describe how the squares change their state depending on what's happening around them.
The Python library [CellPyLib](https://github.com/lantunes/cellpylib) was used to generate the labeled images.
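A rough sketch of how one such image could be generated with CellPyLib (the rule number, grid size, and timestep count below are arbitrary choices, not the exact settings used for this dataset):

```python
import cellpylib as cpl

# One row of 200 cells, with a single "on" cell in the middle.
ca = cpl.init_simple(200)

# Evolve for 100 timesteps under Wolfram's rule 30 (one of the 256 elementary rulesets).
ca = cpl.evolve(ca, timesteps=100,
                apply_rule=lambda n, c, t: cpl.nks_rule(n, 30))

# Render the evolution; each row of the plot is one timestep.
cpl.plot(ca)
```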
- **Curated by:** Kathy McGuiness
## Uses
One toy use-case is fine-tuning the [aMUSEd](https://huggingface.co/blog/amused) model.
## Dataset Creation
The dataset was created as a demo of how to create a labeled image dataset.
| kfahn/cellular_automata | [
"license:mit",
"region:us"
] | 2024-01-23T18:28:29+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2343966.0, "num_examples": 1000}], "download_size": 2314338, "dataset_size": 2343966.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}, {"config_name": "metadata", "data_files": "metadata.jsonl"}]} | 2024-01-26T21:51:19+00:00 | [] | [] | TAGS
#license-mit #region-us
| # Dataset Card for Cellular Automata
## Dataset Details
This dataset contains 1000 labeled images of cellular automata.
### Dataset Description
Cellular automata were described by
Stephen Wolfram in A New Kind of Science.
Imagine you have a grid, like a checkerboard. Each square in the grid has a state - on or off, with the state of the square determining its color. There are rulesets (256 of them) that describe how the squares change their state depending on what's happening around them.
The Python library CellPyLib was used to generate the labeled images.
- Curated by: Kathy McGuiness
## Uses
One toy use-case is fine-tuning the aMUSEd model.
## Dataset Creation
The dataset was created as a demo of how to create a labeled image dataset.
| [
"# Dataset Card for Cellular Automata",
"## Dataset Details\n\nThis dataset contains 1000 labeled images of cellular automata.",
"### Dataset Description\n\nCellular Automaton were described by\nStephen Wolphram in A New King of Science.\nImagine you have a grid, like a checkerboard. Each square in the grid has a state - on or off, with the state of the square determining its color. There are rulesets (256 of them) that describe how the squares change their state depending on what's happening around them.\n\nThe python library CellPyLib was used to generate the labeled images.\n\n- Curated by: Kathy McGuiness",
"## Uses\n\nOne toy use-case is fine-tuning the aMUSEd model.",
"## Dataset Creation\n\nThe dataset was created as a demo on how on to create a labeled image dataset."
] | [
"TAGS\n#license-mit #region-us \n",
"# Dataset Card for Cellular Automata",
"## Dataset Details\n\nThis dataset contains 1000 labeled images of cellular automata.",
"### Dataset Description\n\nCellular Automaton were described by\nStephen Wolphram in A New King of Science.\nImagine you have a grid, like a checkerboard. Each square in the grid has a state - on or off, with the state of the square determining its color. There are rulesets (256 of them) that describe how the squares change their state depending on what's happening around them.\n\nThe python library CellPyLib was used to generate the labeled images.\n\n- Curated by: Kathy McGuiness",
"## Uses\n\nOne toy use-case is fine-tuning the aMUSEd model.",
"## Dataset Creation\n\nThe dataset was created as a demo on how on to create a labeled image dataset."
] |
3dd67d5c2cf0a62f05138bae8a1f06c818a30d0e |
# Common Voice Corpus 16.1 Català (up_votes>5)
Dataset extracted from [mozilla-foundation/common_voice_16_1](/mozilla-foundation/common_voice_16_1), keeping only the Catalan train and test splits, with up_votes > 5 | xaviviro/common_voice_16_1_ca_up_5 | [
"language:ca",
"license:cc0-1.0",
"region:us"
] | 2024-01-23T18:39:49+00:00 | {"language": ["ca"], "license": "cc0-1.0", "pretty_name": "Common Voice Corpus 16.1 Catal\u00e0 (up_votes>5)", "dataset_info": {"features": [{"name": "client_id", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 48000}}}, {"name": "sentence", "dtype": "string"}, {"name": "up_votes", "dtype": "int64"}, {"name": "down_votes", "dtype": "int64"}, {"name": "age", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "accent", "dtype": "string"}, {"name": "locale", "dtype": "string"}, {"name": "segment", "dtype": "string"}, {"name": "variant", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5810210452.233682, "num_examples": 164061}, {"name": "test", "num_bytes": 19984266.44859813, "num_examples": 525}], "download_size": 4933447772, "dataset_size": 5830194718.68228}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-23T21:00:15+00:00 | [] | [
"ca"
] | TAGS
#language-Catalan #license-cc0-1.0 #region-us
|
# Common Voice Corpus 16.1 Català (up_votes>5)
Dataset extracted from mozilla-foundation/common_voice_16_1, keeping only the Catalan train and test splits, with up_votes > 5 | [
"# Common Voice Corpus 16.1 Català (up_votes>5)\n\nDataset extret de mozilla-foundation/common_voice_16_1 només els splits train i test del Català i amb up_votes > 5"
] | [
"TAGS\n#language-Catalan #license-cc0-1.0 #region-us \n",
"# Common Voice Corpus 16.1 Català (up_votes>5)\n\nDataset extret de mozilla-foundation/common_voice_16_1 només els splits train i test del Català i amb up_votes > 5"
] |
2584feadb9c166fddca1047f5d9755d4c3d3c5da |
# Molecular Sets (MOSES): A benchmarking platform for molecular generation models
Deep generative models are rapidly becoming popular for the discovery of new molecules and materials. Such models learn on a large collection of molecular structures and produce novel compounds. In this work, we introduce Molecular Sets (MOSES), a benchmarking platform to support research on machine learning for drug discovery. MOSES implements several popular molecular generation models and provides a set of metrics to evaluate the quality and diversity of generated molecules. With MOSES, we aim to standardize the research on molecular generation and facilitate the sharing and comparison of new models.
__For more details, please refer to the [paper](https://arxiv.org/abs/1811.12823).__
If you are using MOSES in your research paper, please cite us as
```
@article{10.3389/fphar.2020.565644,
title={{M}olecular {S}ets ({MOSES}): {A} {B}enchmarking {P}latform for {M}olecular {G}eneration {M}odels},
author={Polykovskiy, Daniil and Zhebrak, Alexander and Sanchez-Lengeling, Benjamin and Golovanov, Sergey and Tatanov, Oktai and Belyaev, Stanislav and Kurbanov, Rauf and Artamonov, Aleksey and Aladinskiy, Vladimir and Veselov, Mark and Kadurin, Artur and Johansson, Simon and Chen, Hongming and Nikolenko, Sergey and Aspuru-Guzik, Alan and Zhavoronkov, Alex},
journal={Frontiers in Pharmacology},
year={2020}
}
```
## Dataset
We propose [a benchmarking dataset](https://media.githubusercontent.com/media/molecularsets/moses/master/data/dataset_v1.csv) refined from the ZINC database.
The set is based on the ZINC Clean Leads collection. It contains 4,591,276 molecules in total, filtered by molecular weight in the range from 250 to 350 Daltons, a number of rotatable bonds not greater than 7, and XlogP less than or equal to 3.5. We removed molecules containing charged atoms or atoms besides C, N, S, O, F, Cl, Br, H or cycles longer than 8 atoms. The molecules were filtered via medicinal chemistry filters (MCFs) and PAINS filters.
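As a rough sketch of the numeric part of these filters (the structural, MCF, and PAINS filters are omitted, and RDKit's Crippen `MolLogP` is only a stand-in for XlogP, so borderline molecules may be judged differently than in the original pipeline):

```python
# Illustrative check of the numeric MOSES filters with RDKit.
# Crippen MolLogP approximates XlogP; MCF/PAINS filters are not applied here.
from rdkit import Chem
from rdkit.Chem import Descriptors

def passes_numeric_filters(smiles: str) -> bool:
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False
    return (250 <= Descriptors.MolWt(mol) <= 350
            and Descriptors.NumRotatableBonds(mol) <= 7
            and Descriptors.MolLogP(mol) <= 3.5)

print(passes_numeric_filters("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin, MW ~180 -> False
```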
The dataset contains 1,936,962 molecular structures. For experiments, we split the dataset into a training, test and scaffold test sets containing around 1.6M, 176k, and 176k molecules respectively. The scaffold test set contains unique Bemis-Murcko scaffolds that were not present in the training and test sets. We use this set to assess how well the model can generate previously unobserved scaffolds. | katielink/moses | [
"size_categories:1M<n<10M",
"license:mit",
"chemistry",
"arxiv:1811.12823",
"region:us"
] | 2024-01-23T18:42:31+00:00 | {"license": "mit", "size_categories": ["1M<n<10M"], "tags": ["chemistry"]} | 2024-01-23T18:49:23+00:00 | [
"1811.12823"
] | [] | TAGS
#size_categories-1M<n<10M #license-mit #chemistry #arxiv-1811.12823 #region-us
|
# Molecular Sets (MOSES): A benchmarking platform for molecular generation models
Deep generative models are rapidly becoming popular for the discovery of new molecules and materials. Such models learn on a large collection of molecular structures and produce novel compounds. In this work, we introduce Molecular Sets (MOSES), a benchmarking platform to support research on machine learning for drug discovery. MOSES implements several popular molecular generation models and provides a set of metrics to evaluate the quality and diversity of generated molecules. With MOSES, we aim to standardize the research on molecular generation and facilitate the sharing and comparison of new models.
__For more details, please refer to the paper.__
If you are using MOSES in your research paper, please cite us as
## Dataset
We propose a benchmarking dataset refined from the ZINC database.
The set is based on the ZINC Clean Leads collection. It contains 4,591,276 molecules in total, filtered by molecular weight in the range from 250 to 350 Daltons, a number of rotatable bonds not greater than 7, and XlogP less than or equal to 3.5. We removed molecules containing charged atoms or atoms besides C, N, S, O, F, Cl, Br, H or cycles longer than 8 atoms. The molecules were filtered via medicinal chemistry filters (MCFs) and PAINS filters.
The dataset contains 1,936,962 molecular structures. For experiments, we split the dataset into a training, test and scaffold test sets containing around 1.6M, 176k, and 176k molecules respectively. The scaffold test set contains unique Bemis-Murcko scaffolds that were not present in the training and test sets. We use this set to assess how well the model can generate previously unobserved scaffolds. | [
"# Molecular Sets (MOSES): A benchmarking platform for molecular generation models\n\nDeep generative models are rapidly becoming popular for the discovery of new molecules and materials. Such models learn on a large collection of molecular structures and produce novel compounds. In this work, we introduce Molecular Sets (MOSES), a benchmarking platform to support research on machine learning for drug discovery. MOSES implements several popular molecular generation models and provides a set of metrics to evaluate the quality and diversity of generated molecules. With MOSES, we aim to standardize the research on molecular generation and facilitate the sharing and comparison of new models.\n\n__For more details, please refer to the paper.__\n\nIf you are using MOSES in your research paper, please cite us as",
"## Dataset\n\nWe propose a benchmarking dataset refined from the ZINC database.\n\nThe set is based on the ZINC Clean Leads collection. It contains 4,591,276 molecules in total, filtered by molecular weight in the range from 250 to 350 Daltons, a number of rotatable bonds not greater than 7, and XlogP less than or equal to 3.5. We removed molecules containing charged atoms or atoms besides C, N, S, O, F, Cl, Br, H or cycles longer than 8 atoms. The molecules were filtered via medicinal chemistry filters (MCFs) and PAINS filters.\n\nThe dataset contains 1,936,962 molecular structures. For experiments, we split the dataset into a training, test and scaffold test sets containing around 1.6M, 176k, and 176k molecules respectively. The scaffold test set contains unique Bemis-Murcko scaffolds that were not present in the training and test sets. We use this set to assess how well the model can generate previously unobserved scaffolds."
] | [
"TAGS\n#size_categories-1M<n<10M #license-mit #chemistry #arxiv-1811.12823 #region-us \n",
"# Molecular Sets (MOSES): A benchmarking platform for molecular generation models\n\nDeep generative models are rapidly becoming popular for the discovery of new molecules and materials. Such models learn on a large collection of molecular structures and produce novel compounds. In this work, we introduce Molecular Sets (MOSES), a benchmarking platform to support research on machine learning for drug discovery. MOSES implements several popular molecular generation models and provides a set of metrics to evaluate the quality and diversity of generated molecules. With MOSES, we aim to standardize the research on molecular generation and facilitate the sharing and comparison of new models.\n\n__For more details, please refer to the paper.__\n\nIf you are using MOSES in your research paper, please cite us as",
"## Dataset\n\nWe propose a benchmarking dataset refined from the ZINC database.\n\nThe set is based on the ZINC Clean Leads collection. It contains 4,591,276 molecules in total, filtered by molecular weight in the range from 250 to 350 Daltons, a number of rotatable bonds not greater than 7, and XlogP less than or equal to 3.5. We removed molecules containing charged atoms or atoms besides C, N, S, O, F, Cl, Br, H or cycles longer than 8 atoms. The molecules were filtered via medicinal chemistry filters (MCFs) and PAINS filters.\n\nThe dataset contains 1,936,962 molecular structures. For experiments, we split the dataset into a training, test and scaffold test sets containing around 1.6M, 176k, and 176k molecules respectively. The scaffold test set contains unique Bemis-Murcko scaffolds that were not present in the training and test sets. We use this set to assess how well the model can generate previously unobserved scaffolds."
] |
1b75f67f23a11daad9ebd1d68e92d464e02754f9 | # Dataset Card for "downstream-7-flat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sordonia/downstream-7-flat | [
"region:us"
] | 2024-01-23T18:48:42+00:00 | {"dataset_info": {"features": [{"name": "task_name", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 32570209, "num_examples": 114170}], "download_size": 16740726, "dataset_size": 32570209}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-23T20:31:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "downstream-7-flat"
More Information needed | [
"# Dataset Card for \"downstream-7-flat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"downstream-7-flat\"\n\nMore Information needed"
] |
fdfaf7af49de505d9e095d7f540c42e26a73a105 |
# Diverse Restricted JSON Data Extraction
- **Curated by:** The [paraloq analytics](https://www.paraloq.ai) team.
## Uses
1. **Benchmark** restricted JSON data extraction (text + JSON schema -> JSON instance)
2. **Fine-Tune** data extraction model (text + JSON schema -> JSON instance)
3. **Fine-Tune** JSON schema Retrieval model (text -> retriever -> most adequate JSON schema)
### Out-of-Scope Use
Intended for research purposes only.
## Dataset Structure
The data comes with the following fields:
- **title**: The title of the schema.
- **topic**: The general topic of the item. For a list of topics, see below.
- **schema**: The JSON schema specifying the structure of the data.
- **item**: A JSON instance of the schema holding actual data.
- **medium**: The medium of the example data. Examples include "news article", "blog post", "email", "html web page", "conversation", etc.
- **text**: An instance of the given medium, containing all the information held by the item, along with additional information.
A focus of this dataset is to provide a diverse set of items from a wide array of topics. We currently include the following topic areas:
- **simple**: Simple, general documents such as to-do lists, calendars, recipes, etc. This is the most generic topic and is designed to be easy to extract.
- **medical**: Medical documents such as patient records, prescriptions, test results, etc.
- **ecommerce**: Ecommerce documents such as product listings, shopping carts, order confirmations, etc.
- **business**: Business documents such as invoices, purchase orders, contracts, etc.
- **travel**: Travel documents such as flight bookings, hotel reservations, itineraries, etc.
- **media**: Media documents such as movie reviews, music albums, video games, etc.
- **technology**: Technology documents such as software licenses, API responses, error logs, etc.
- **manufacturing**: Manufacturing documents such as product BOMs, work orders, inspection reports, COAs etc.
## Dataset Creation
### Curation Rationale
We use this dataset to benchmark different models for their ability to extract data from unstructured text in a zero shot fashion, by including the desired JSON schema in the prompt.
The dataset can also be used to fine-tune a model to extract data in a zero-shot manner, feeding text and a target JSON schema. Note that the difficulty here is typically not getting the model output to adhere to the desired JSON schema; adherence can be enforced by restricting generation using [guidance](https://github.com/guidance-ai/guidance) or [outlines](https://github.com/outlines-dev/outlines). For us, the issue is more often that a model does not extract all of the available data.
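As a minimal sketch of such a zero-shot setup (the `train` split name is an assumption; the field names follow the structure listed above):

```python
# Sketch of building a zero-shot extraction prompt from one dataset record.
# Assumes a "train" split; the "schema" and "text" fields are from the card.
from datasets import load_dataset

ds = load_dataset("paraloq/json_data_extraction", split="train")
ex = ds[0]
prompt = (
    "Extract a JSON instance of the following schema from the text.\n"
    f"Schema:\n{ex['schema']}\n\n"
    f"Text:\n{ex['text']}\n\n"
    "JSON:"
)
# The model's answer can then be produced with constrained decoding
# (e.g. guidance or outlines) so it is guaranteed to satisfy the schema.
```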
### Source Data
This data is synthetically generated using Google's Gemini-Pro.
#### Data Collection and Processing
1. Prompt the model to generate a list of JSON schemas representing a diverse set of items.
2. Prompt the model to create instances from each of the schemas.
3. Prompt the model to generate text (in the form of a blog post, server logs, emails, chats, etc.) that contains the information held by the instance.
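A hypothetical sketch of these three prompting steps with the public `google-generativeai` client is shown below; the prompts, example topic, and client usage are illustrative assumptions, not the authors' actual pipeline:

```python
# Hypothetical three-step generation pipeline; prompts are illustrative.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential
model = genai.GenerativeModel("gemini-pro")

# 1. Generate a JSON schema for some item.
schema = model.generate_content(
    "Write a JSON schema describing a hotel reservation.").text
# 2. Instantiate the schema with realistic data.
item = model.generate_content(
    f"Create a realistic JSON instance of this schema:\n{schema}").text
# 3. Embed the instance's information in free text of some medium.
text = model.generate_content(
    f"Write a confirmation email containing all information in:\n{item}").text
```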
#### Who are the source data producers?
paraloq analytics is an Austrian AI research and development company based in Vienna.
## Bias, Risks, and Limitations
The data might include biases resulting from the sampling and bias propagation from Google's Gemini-Pro.
## Dataset Card Authors
Max Arrich
| paraloq/json_data_extraction | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"json",
"data-extraction",
"structured-generation",
"restricted-generation",
"ecommerce",
"medical",
"manufacturing",
"server logs",
"news",
"region:us"
] | 2024-01-23T18:51:56+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "Diverse Restricted JSON Data Extraction", "tags": ["json", "data-extraction", "structured-generation", "restricted-generation", "ecommerce", "medical", "manufacturing", "server logs", "news"]} | 2024-02-07T08:43:39+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-English #license-apache-2.0 #json #data-extraction #structured-generation #restricted-generation #ecommerce #medical #manufacturing #server logs #news #region-us
|
# Diverse Restricted JSON Data Extraction
- Curated by: The paraloq analytics team.
## Uses
1. Benchmark restricted JSON data extraction (text + JSON schema -> JSON instance)
2. Fine-Tune data extraction model (text + JSON schema -> JSON instance)
3. Fine-Tune JSON schema Retrieval model (text -> retriever -> most adequate JSON schema)
### Out-of-Scope Use
Intended for research purposes only.
## Dataset Structure
The data comes with the following fields:
- title: The title of the schema.
- topic: The general topic of the item. For a list of topics, see below.
- schema: The JSON schema specifying the structure of the data.
- item: A JSON instance of the schema holding actual data.
- medium: The medium of the example data. Examples include "news article", "blog post", "email", "html web page", "conversation", etc.
- text: An instance of the given medium, containing all the information held by the item, along with additional information.
A focus of this dataset is to provide a diverse set of items from a wide array of topics. We currently include the following topic areas:
- simple: Simple, general documents such as to-do lists, calendars, recipes, etc. This is the most generic topic and is designed to be easy to extract.
- medical: Medical documents such as patient records, prescriptions, test results, etc.
- ecommerce: Ecommerce documents such as product listings, shopping carts, order confirmations, etc.
- business: Business documents such as invoices, purchase orders, contracts, etc.
- travel: Travel documents such as flight bookings, hotel reservations, itineraries, etc.
- media: Media documents such as movie reviews, music albums, video games, etc.
- technology: Technology documents such as software licenses, API responses, error logs, etc.
- manufacturing: Manufacturing documents such as product BOMs, work orders, inspection reports, COAs etc.
## Dataset Creation
### Curation Rationale
We use this dataset to benchmark different models for their ability to extract data from unstructured text in a zero shot fashion, by including the desired JSON schema in the prompt.
The dataset can also be used to fine-tune a model to extract data in a zero-shot manner, feeding text and a target JSON schema. Note that the difficulty here is typically not getting the model output to adhere to the desired JSON schema; adherence can be enforced by restricting generation using guidance or outlines. For us, the issue is more often that a model does not extract all of the available data.
### Source Data
This data is synthetically generated using Google's Gemini-Pro.
#### Data Collection and Processing
1. Prompt the model to generate a list of JSON schemas representing a diverse set of items.
2. Prompt the model to create instances from each of the schemas.
3. Prompt the model to generate text (in the form of a blog post, server logs, emails, chats, etc.) that contains the information held by the instance.
#### Who are the source data producers?
paraloq analytics is an Austrian AI research and development company based in Vienna.
## Bias, Risks, and Limitations
The data might include biases resulting from the sampling and bias propagation from Google's Gemini-Pro.
## Dataset Card Authors
Max Arrich
| [
"# Diverse Restricted JSON Data Extraction\n\n- Curated by: The paraloq analytics team.",
"## Uses\n\n1. Benchmark restricted JSON data extraction (text + JSON schema -> JSON instance)\n2. Fine-Tune data extraction model (text + JSON schema -> JSON instance)\n3. Fine-Tune JSON schema Retrieval model (text -> retriever -> most adequate JSON schema)",
"### Out-of-Scope Use\n\nIntended for research purposes only.",
"## Dataset Structure\n\nThe data comes with the following fields:\n- title: The title of the schema.\n- topic: The general topic of the item. For a list of topis, see below.\n- schema: The JSON schema specifying the structure of the data.\n- item: A JSON instance of the schema holding actual data.\n- medium: The medium of the example data. Examples inlcude \"news article\", \"blog post\", \"email\", \"html web page\", \"conversation\", etc.\n- text: An instance of the given medium, containing all the information held by the item, along with additional information.\n\nA focus of this dataset is to provide a diverse set of items from a wide array of topics. We currently include the following topic areas:\n\n- simple: Simple, general, documents such as to-do lists, calendars, recipes, etc. This is the most generic topic and is designed to be easy to exract.\n- medical: Medical documents such as patient records, prescriptions, test results, etc.\n- ecommerce: Ecommerce documents such as product listings, shopping carts, order confirmations, etc.\n- business: Business documents such as invoices, purchase orders, contracts, etc.\n- travel: Travel documents such as flight bookings, hotel reservations, itineraries, etc.\n- media: Media documents such as movie reviews, music albums, video games, etc.\n- technology: Technology documents such as software licenses, API responses, error logs, etc.\n- manufacturing: Manufacturing documents such as product BOMs, work orders, inspection reports, COAs etc.",
"## Dataset Creation",
"### Curation Rationale\n\nWe use this dataset to benchmark different models for their ability to extract data from unstructured text in a zero shot fashion, by including the desired JSON schema in the prompt.\nThe dataset can also be used to fine-tune a model to extract data in a zero-shot manner, feeding text and a target JSON schema. Note that the difficulty here is typically not that the model output is not adhering to the desired JSON schema. This can be enforced by restricing generation using guidance or outlines. For us, the issue is often that a model would not extract all of the available data.",
"### Source Data\n\nThis data is synthetically generated using Google's Gemini-Pro.",
"#### Data Collection and Processing\n\n1. Prompt the model to generate a list of JSON schemas representing a diverse set of items.\n2. Prompt the model to create instances from each of the schemas.\n3. Prompt the model to generate text (in the form of a blog post, server logs, emails, chats, etc.) that contains the information held by the instance.",
"#### Who are the source data producers?\n\nparaloq analytics is an Austrian AI research and development company based in Vienna.",
"## Bias, Risks, and Limitations\n\nThe data might include biases resulting from the sampling and bias propagation from Google's Gemini-Pro.",
"## Dataset Card Authors\n\nMax Arrich"
] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-English #license-apache-2.0 #json #data-extraction #structured-generation #restricted-generation #ecommerce #medical #manufacturing #server logs #news #region-us \n",
"# Diverse Restricted JSON Data Extraction\n\n- Curated by: The paraloq analytics team.",
"## Uses\n\n1. Benchmark restricted JSON data extraction (text + JSON schema -> JSON instance)\n2. Fine-Tune data extraction model (text + JSON schema -> JSON instance)\n3. Fine-Tune JSON schema Retrieval model (text -> retriever -> most adequate JSON schema)",
"### Out-of-Scope Use\n\nIntended for research purposes only.",
"## Dataset Structure\n\nThe data comes with the following fields:\n- title: The title of the schema.\n- topic: The general topic of the item. For a list of topis, see below.\n- schema: The JSON schema specifying the structure of the data.\n- item: A JSON instance of the schema holding actual data.\n- medium: The medium of the example data. Examples inlcude \"news article\", \"blog post\", \"email\", \"html web page\", \"conversation\", etc.\n- text: An instance of the given medium, containing all the information held by the item, along with additional information.\n\nA focus of this dataset is to provide a diverse set of items from a wide array of topics. We currently include the following topic areas:\n\n- simple: Simple, general, documents such as to-do lists, calendars, recipes, etc. This is the most generic topic and is designed to be easy to exract.\n- medical: Medical documents such as patient records, prescriptions, test results, etc.\n- ecommerce: Ecommerce documents such as product listings, shopping carts, order confirmations, etc.\n- business: Business documents such as invoices, purchase orders, contracts, etc.\n- travel: Travel documents such as flight bookings, hotel reservations, itineraries, etc.\n- media: Media documents such as movie reviews, music albums, video games, etc.\n- technology: Technology documents such as software licenses, API responses, error logs, etc.\n- manufacturing: Manufacturing documents such as product BOMs, work orders, inspection reports, COAs etc.",
"## Dataset Creation",
"### Curation Rationale\n\nWe use this dataset to benchmark different models for their ability to extract data from unstructured text in a zero shot fashion, by including the desired JSON schema in the prompt.\nThe dataset can also be used to fine-tune a model to extract data in a zero-shot manner, feeding text and a target JSON schema. Note that the difficulty here is typically not that the model output is not adhering to the desired JSON schema. This can be enforced by restricing generation using guidance or outlines. For us, the issue is often that a model would not extract all of the available data.",
"### Source Data\n\nThis data is synthetically generated using Google's Gemini-Pro.",
"#### Data Collection and Processing\n\n1. Prompt the model to generate a list of JSON schemas representing a diverse set of items.\n2. Prompt the model to create instances from each of the schemas.\n3. Prompt the model to generate text (in the form of a blog post, server logs, emails, chats, etc.) that contains the information held by the instance.",
"#### Who are the source data producers?\n\nparaloq analytics is an Austrian AI research and development company based in Vienna.",
"## Bias, Risks, and Limitations\n\nThe data might include biases resulting from the sampling and bias propagation from Google's Gemini-Pro.",
"## Dataset Card Authors\n\nMax Arrich"
] |
07dcc4abda970efe25ab4736c34fe9eba493e6f7 | # Responsible Media Content Matrix (RMCM): Overview
The RMCM is a strategic tool developed to address various forms of bias and unethical practices in media reporting. It encompasses several key categories, each focusing on a specific type of bias or ethical concern. The primary objective of the RMCM is to foster responsible journalism and content creation by providing clear guidelines on identifying and rectifying biased or harmful content.
## Key Categories of RMCM:
### Toxicity
This includes content that is aggressive, rude, disrespectful, or contributes to a hostile environment. It particularly focuses on hate speech and other forms of communication that incite hatred or violence based on race, religion, gender, or sexual orientation.
### Stereotyping
This category deals with generalized or oversimplified beliefs about specific groups or communities. It aims to identify and correct content that perpetuates stereotypes, particularly those related to sexual themes, age, gender biases, or cultural misconceptions.
### Bias
This refers to any content showing an unjustifiable preference or prejudice towards certain viewpoints, groups, or individuals. It includes both overt bias and more subtle forms of partiality that can skew the presentation of information.
### Harm
This encompasses content that could cause distress or harm to individuals or society. It includes sensationalized or unethical reporting of criminal behavior, as well as content that inappropriately focuses on or glorifies violence and weaponry.
The RMCM serves as a guide for content creators, editors, and journalists, encouraging them to critically assess their work for potential biases and harmful elements. By adhering to the principles of the RMCM, media professionals can contribute to a more ethical, balanced, and responsible media landscape.
| newsmediabias/Bias-Debias-Alpaca | [
"region:us"
] | 2024-01-23T19:11:39+00:00 | {} | 2024-01-25T16:09:10+00:00 | [] | [] | TAGS
#region-us
| # Responsible Media Content Matrix (RMCM): Overview
The RMCM is a strategic tool developed to address various forms of bias and unethical practices in media reporting. It encompasses several key categories, each focusing on a specific type of bias or ethical concern. The primary objective of the RMCM is to foster responsible journalism and content creation by providing clear guidelines on identifying and rectifying biased or harmful content.
## Key Categories of RMCM:
### Toxicity
This includes content that is aggressive, rude, disrespectful, or contributes to a hostile environment. It particularly focuses on hate speech and other forms of communication that incite hatred or violence based on race, religion, gender, or sexual orientation.
### Stereotyping
This category deals with generalized or oversimplified beliefs about specific groups or communities. It aims to identify and correct content that perpetuates stereotypes, particularly those related to sexual themes, age, gender biases, or cultural misconceptions.
### Bias
This refers to any content showing an unjustifiable preference or prejudice towards certain viewpoints, groups, or individuals. It includes both overt bias and more subtle forms of partiality that can skew the presentation of information.
### Harm
This encompasses content that could cause distress or harm to individuals or society. It includes sensationalized or unethical reporting of criminal behavior, as well as content that inappropriately focuses on or glorifies violence and weaponry.
The RMCM serves as a guide for content creators, editors, and journalists, encouraging them to critically assess their work for potential biases and harmful elements. By adhering to the principles of the RMCM, media professionals can contribute to a more ethical, balanced, and responsible media landscape.
| [
"# Responsible Media Content Matrix (RMCM): Overview\n\nThe RMCM is a strategic tool developed to address various forms of bias and unethical practices in media reporting. It encompasses several key categories, each focusing on a specific type of bias or ethical concern. The primary objective of the RMCM is to foster responsible journalism and content creation by providing clear guidelines on identifying and rectifying biased or harmful content.",
"## Key Categories of RMCM:",
"### Toxicity\nThis includes content that is aggressive, rude, disrespectful, or contributes to a hostile environment. It particularly focuses on hate speech and other forms of communication that incite hatred or violence based on race, religion, gender, or sexual orientation.",
"### Stereotyping\nThis category deals with generalized or oversimplified beliefs about specific groups or communities. It aims to identify and correct content that perpetuates stereotypes, particularly those related to sexual themes, age, gender biases, or cultural misconceptions.",
"### Bias\nThis refers to any content showing an unjustifiable preference or prejudice towards certain viewpoints, groups, or individuals. It includes both overt bias and more subtle forms of partiality that can skew the presentation of information.",
"### Harm\nThis encompasses content that could cause distress or harm to individuals or society. It includes sensationalized or unethical reporting of criminal behavior, as well as content that inappropriately focuses on or glorifies violence and weaponry.\n\nThe RMCM serves as a guide for content creators, editors, and journalists, encouraging them to critically assess their work for potential biases and harmful elements. By adhering to the principles of the RMCM, media professionals can contribute to a more ethical, balanced, and responsible media landscape."
] | [
"TAGS\n#region-us \n",
"# Responsible Media Content Matrix (RMCM): Overview\n\nThe RMCM is a strategic tool developed to address various forms of bias and unethical practices in media reporting. It encompasses several key categories, each focusing on a specific type of bias or ethical concern. The primary objective of the RMCM is to foster responsible journalism and content creation by providing clear guidelines on identifying and rectifying biased or harmful content.",
"## Key Categories of RMCM:",
"### Toxicity\nThis includes content that is aggressive, rude, disrespectful, or contributes to a hostile environment. It particularly focuses on hate speech and other forms of communication that incite hatred or violence based on race, religion, gender, or sexual orientation.",
"### Stereotyping\nThis category deals with generalized or oversimplified beliefs about specific groups or communities. It aims to identify and correct content that perpetuates stereotypes, particularly those related to sexual themes, age, gender biases, or cultural misconceptions.",
"### Bias\nThis refers to any content showing an unjustifiable preference or prejudice towards certain viewpoints, groups, or individuals. It includes both overt bias and more subtle forms of partiality that can skew the presentation of information.",
"### Harm\nThis encompasses content that could cause distress or harm to individuals or society. It includes sensationalized or unethical reporting of criminal behavior, as well as content that inappropriately focuses on or glorifies violence and weaponry.\n\nThe RMCM serves as a guide for content creators, editors, and journalists, encouraging them to critically assess their work for potential biases and harmful elements. By adhering to the principles of the RMCM, media professionals can contribute to a more ethical, balanced, and responsible media landscape."
] |
8c09a96ed9ea8d5ebbb823c16e354b7f8a06702f | # Dataset Card for "semeval_subtask2_conversations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | dim/semeval_subtask2_conversations | [
"region:us"
] | 2024-01-23T19:14:49+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "conversation_ID", "dtype": "int64"}, {"name": "conversation", "list": [{"name": "emotion", "dtype": "string"}, {"name": "speaker", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "utterance_ID", "dtype": "int64"}, {"name": "video_name", "dtype": "string"}]}, {"name": "emotion-cause_pairs", "sequence": {"sequence": "string"}}], "splits": [{"name": "train", "num_bytes": 1409288.2445414846, "num_examples": 1264}, {"name": "test", "num_bytes": 122643.75545851528, "num_examples": 110}], "download_size": 585135, "dataset_size": 1531932.0}} | 2024-01-23T19:14:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "semeval_subtask2_conversations"
More Information needed | [
"# Dataset Card for \"semeval_subtask2_conversations\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"semeval_subtask2_conversations\"\n\nMore Information needed"
] |
ca63e062c97c1affa4c658e2bad8d74428971fde | # lilac/MMLU
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/cais/mmlu](https://huggingface.co/datasets/cais/mmlu)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-MMLU
```
or from python with:
```py
ll.download("lilacai/lilac-MMLU")
```
| lilacai/lilac-MMLU | [
"Lilac",
"region:us"
] | 2024-01-23T19:16:39+00:00 | {"tags": ["Lilac"]} | 2024-01-23T19:16:42+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/MMLU
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/MMLU\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/MMLU\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
0dab92c42f72412b391b6e2bd453c156c9593caf | # lilac/mosaic-instruct-v3
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/mosaicml/instruct-v3](https://huggingface.co/datasets/mosaicml/instruct-v3)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-mosaic-instruct-v3
```
or from python with:
```py
ll.download("lilacai/lilac-mosaic-instruct-v3")
```
| lilacai/lilac-mosaic-instruct-v3 | [
"Lilac",
"region:us"
] | 2024-01-23T19:24:01+00:00 | {"tags": ["Lilac"]} | 2024-01-26T14:58:38+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/mosaic-instruct-v3
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/mosaic-instruct-v3\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/mosaic-instruct-v3\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
d725ac10d18fa502473f4520704541fa075c4bbc |
# Dataset Card for Evaluation run of FelixChao/WestSeverus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/WestSeverus-7B](https://huggingface.co/FelixChao/WestSeverus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__WestSeverus-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-23T19:22:25.725845](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-7B/blob/main/results_2024-01-23T19-22-25.725845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546959623905765,
"acc_stderr": 0.032069118843639784,
"acc_norm": 0.6545111906904139,
"acc_norm_stderr": 0.032737272576266796,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6289063808843572,
"mc2_stderr": 0.015244465231660157
},
"harness|arc:challenge|25": {
"acc": 0.674061433447099,
"acc_stderr": 0.013697432466693247,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725225
},
"harness|hellaswag|10": {
"acc": 0.6905994821748656,
"acc_stderr": 0.0046130181011853014,
"acc_norm": 0.8746265684126668,
"acc_norm_stderr": 0.0033046510372765534
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291936,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500673,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500673
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031215,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031215
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6289063808843572,
"mc2_stderr": 0.015244465231660157
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222789
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.01265254413318614
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__WestSeverus-7B | [
"region:us"
] | 2024-01-23T19:24:45+00:00 | {"pretty_name": "Evaluation run of FelixChao/WestSeverus-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/WestSeverus-7B](https://huggingface.co/FelixChao/WestSeverus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__WestSeverus-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T19:22:25.725845](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WestSeverus-7B/blob/main/results_2024-01-23T19-22-25.725845.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546959623905765,\n \"acc_stderr\": 0.032069118843639784,\n \"acc_norm\": 0.6545111906904139,\n \"acc_norm_stderr\": 0.032737272576266796,\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6289063808843572,\n \"mc2_stderr\": 0.015244465231660157\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.674061433447099,\n \"acc_stderr\": 0.013697432466693247,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725225\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6905994821748656,\n \"acc_stderr\": 0.0046130181011853014,\n \"acc_norm\": 0.8746265684126668,\n \"acc_norm_stderr\": 0.0033046510372765534\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291936,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291936\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n \"acc_stderr\": 0.016482782187500673,\n \"acc_norm\": 0.41564245810055866,\n \"acc_norm_stderr\": 0.016482782187500673\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031215,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031215\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6289063808843572,\n \"mc2_stderr\": 0.015244465231660157\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \"acc_stderr\": 0.01265254413318614\n 
}\n}\n```", "repo_url": "https://huggingface.co/FelixChao/WestSeverus-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|arc:challenge|25_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|gsm8k|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hellaswag|10_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T19_22_25.725845", "path": ["**/details_harness|winogrande|5_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T19-22-25.725845.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T19_22_25.725845", "path": ["results_2024-01-23T19-22-25.725845.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T19-22-25.725845.parquet"]}]}]} | 2024-01-23T19:25:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/WestSeverus-7B
Dataset automatically created during the evaluation run of model FelixChao/WestSeverus-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
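```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_FelixChao__WestSeverus-7B",
    "harness_winogrande_5",
    split="train")
```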
## Latest results
These are the latest results from run 2024-01-23T19:22:25.725845 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
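```python
{
    "all": {
        "acc": 0.6546959623905765,
        "acc_stderr": 0.032069118843639784,
        "acc_norm": 0.6545111906904139,
        "acc_norm_stderr": 0.032737272576266796,
        "mc1": 0.4773561811505508,
        "mc1_stderr": 0.01748554225848965,
        "mc2": 0.6289063808843572,
        "mc2_stderr": 0.015244465231660157
    }
}
```

Per-task scores for all 63 configurations are stored in the run's results file.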
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/WestSeverus-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/WestSeverus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T19:22:25.725845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/WestSeverus-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/WestSeverus-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T19:22:25.725845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0bee382b7b74faba1d6c211c1e7ea994be573ab1 | # lilac/ARC-Easy
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/allenai/ai2_arc](https://huggingface.co/datasets/allenai/ai2_arc)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-ARC-Easy
```
or from python with:
```py
ll.download("lilacai/lilac-ARC-Easy")
```
| lilacai/lilac-ARC-Easy | [
"Lilac",
"region:us"
] | 2024-01-23T19:29:58+00:00 | {"tags": ["Lilac"]} | 2024-01-23T19:44:51+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/ARC-Easy
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
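```bash
lilac download lilacai/lilac-ARC-Easy
```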
or from python with:
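```py
import lilac as ll

ll.download("lilacai/lilac-ARC-Easy")
```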
| [
"# lilac/ARC-Easy\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/ARC-Easy\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
2b6253f59770cb75cee6f71cf424b634f6fdd16b | # lilac/ARC-Challenge
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/allenai/ai2_arc](https://huggingface.co/datasets/allenai/ai2_arc)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-ARC-Challenge
```
or from python with:
```py
ll.download("lilacai/lilac-ARC-Challenge")
```
| lilacai/lilac-ARC-Challenge | [
"Lilac",
"region:us"
] | 2024-01-23T19:30:00+00:00 | {"tags": ["Lilac"]} | 2024-01-23T19:44:52+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/ARC-Challenge
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
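```bash
lilac download lilacai/lilac-ARC-Challenge
```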
or from python with:
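```py
import lilac as ll

ll.download("lilacai/lilac-ARC-Challenge")
```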
| [
"# lilac/ARC-Challenge\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/ARC-Challenge\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
cbaee6fffe479d447573249564240359024e4fa9 | # Dataset Card
This is a synthetic dataset of arXiv-style research paper abstracts and tweets summarizing them, used as a demonstration of the [DataDreamer 🤖💤 library](https://datadreamer.dev/docs/latest/). It was used to train an ["Abstract to Tweet" model](https://huggingface.co/datadreamer-dev/abstracts_to_tweet_model).
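It can be loaded with the Hugging Face `datasets` library in the usual way (a minimal sketch; the repo id, splits, and column names `abstracts`, `prompts`, and `tweets` come from this card's metadata):

```python
from datasets import load_dataset

# Splits: "train" (900 rows) and "validation" (100 rows), per this card's metadata.
dataset = load_dataset("datadreamer-dev/abstracts_and_tweets", split="train")

example = dataset[0]
print(example["abstracts"])  # arXiv-style paper abstract
print(example["tweets"])     # tweet summarizing it
```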
---
This dataset was produced with [DataDreamer 🤖💤](https://datadreamer.dev). The synthetic dataset card can be found [here](datadreamer.json). | datadreamer-dev/abstracts_and_tweets | [
"size_categories:1K<n<10K",
"datadreamer",
"datadreamer-0.1.0",
"synthetic",
"gpt-4",
"region:us"
] | 2024-01-23T19:53:25+00:00 | {"size_categories": ["1K<n<10K"], "dataset_info": {"features": [{"name": "abstracts", "dtype": "string"}, {"name": "prompts", "dtype": "string"}, {"name": "tweets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3127163, "num_examples": 900}, {"name": "validation", "num_bytes": 343839, "num_examples": 100}], "download_size": 1765300, "dataset_size": 3471002}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "library_name": "datadreamer", "tags": ["datadreamer", "datadreamer-0.1.0", "synthetic", "gpt-4", "gpt-4"]} | 2024-02-01T22:23:54+00:00 | [] | [] | TAGS
#size_categories-1K<n<10K #datadreamer #datadreamer-0.1.0 #synthetic #gpt-4 #region-us
| # Dataset Card
This is a synthetic dataset of arXiv-style research paper abstracts and tweets summarizing them, used as a demonstration of the DataDreamer library. It was used to train an "Abstract to Tweet" model.
---
This dataset was produced with DataDreamer. The synthetic dataset card can be found here. | [
"# Dataset Card\n\nThis a synthetic dataset of arXiv-style research paper abstracts and tweets summarizing them used as a demonstration of the DataDreamer library. It was used to train an \"Abstract to Tweet\" model.\n\n---\nThis dataset was produced with DataDreamer . The synthetic dataset card can be found here."
] | [
"TAGS\n#size_categories-1K<n<10K #datadreamer #datadreamer-0.1.0 #synthetic #gpt-4 #region-us \n",
"# Dataset Card\n\nThis a synthetic dataset of arXiv-style research paper abstracts and tweets summarizing them used as a demonstration of the DataDreamer library. It was used to train an \"Abstract to Tweet\" model.\n\n---\nThis dataset was produced with DataDreamer . The synthetic dataset card can be found here."
] |
c1a63a1e8fd10188e2762f2c80ac1b2d9c7cd370 |
# Pseudostreaming Malaysian YouTube videos using Whisper Large V3
Original dataset at https://huggingface.co/datasets/mesolitica/pseudolabel-malaysian-youtube-whisper-large-v3
We use https://huggingface.co/mesolitica/conformer-medium-mixed to generate pseudostreaming dataset, source code at https://github.com/mesolitica/malaysian-dataset/tree/master/speech-to-text-semisupervised/pseudostreaming-whisper
Total duration: 40486.589364839296 hours.
Data format from [processed.jsonl](processed.jsonl):
```json
[
{
"text": "dalam sukan olimpik dan paralimpik tokyo dua ribu dua puluh",
"start": 3.52,
"end": 6.46,
"audio_filename": "processed-audio/1-225586-0.mp3",
"original_audio_filename": "output-audio/3-1084-10.mp3"
},
{
"text": "to azizul has",
"start": 7.12,
"end": 8.179999999999998,
"audio_filename": "processed-audio/1-225586-1.mp3",
"original_audio_filename": "output-audio/3-1084-10.mp3"
},
{
"text": "awang meraih kilauan perak untuk malaysia dalam sukan olimpik tokyo dua ribu dua puluh tampil sebagai satu satunya wakil asia bagaimanapun beliau terpaksa akur di tangan pelumba great britain jason",
"start": 8.4,
"end": 22.98,
"audio_filename": "processed-audio/1-225586-2.mp3",
"original_audio_filename": "output-audio/3-1084-10.mp3"
},
{
"text": "y yang meraih pingat emas",
"start": 23.28,
"end": 25.060000000000002,
"audio_filename": "processed-audio/1-225586-3.mp3",
"original_audio_filename": "output-audio/3-1084-10.mp3"
}
]
```
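Once the archives are extracted (see the how-to below), the manifest can be iterated with the standard library. A minimal sketch, assuming `processed.jsonl` sits in the repository root and that rows are either one JSON object per line or grouped into lists as in the sample above:

```python
import json

segments = []
with open("processed.jsonl") as fh:
    for line in fh:
        line = line.strip()
        if not line:
            continue
        row = json.loads(line)
        # A row may be a single segment dict or a list of segment dicts.
        segments.extend(row if isinstance(row, list) else [row])

for seg in segments[:3]:
    print(f'{seg["start"]:.2f}-{seg["end"]:.2f}s {seg["audio_filename"]}: {seg["text"]}')
```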
## how-to
```bash
git clone https://huggingface.co/datasets/mesolitica/pseudostreaming-malaya-speech-stt
cd pseudostreaming-malaya-speech-stt
# fetch a standalone 7-Zip build, needed to extract the multi-part .7z archive
wget https://www.7-zip.org/a/7z2301-linux-x64.tar.xz
tar -xf 7z2301-linux-x64.tar.xz
# extract all parts of the audio archive (-y: answer yes to prompts, -mmt40: use 40 threads)
./7zz x processed-audio.7z.001 -y -mmt40
``` | mesolitica/pseudostreaming-malaysian-youtube-whisper-large-v3 | [
"task_categories:automatic-speech-recognition",
"language:ms",
"license:mit",
"region:us"
] | 2024-01-23T19:55:01+00:00 | {"language": ["ms"], "license": "mit", "task_categories": ["automatic-speech-recognition"]} | 2024-02-10T03:45:27+00:00 | [] | [
"ms"
] | TAGS
#task_categories-automatic-speech-recognition #language-Malay (macrolanguage) #license-mit #region-us
|
# Pseudostreaming Malaysian YouTube videos using Whisper Large V3
Original dataset at URL
We use URL to generate the pseudostreaming dataset; the source code is at URL
Total duration: 40486.589364839296 hours.
Data format from URL:
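```json
{
    "text": "dalam sukan olimpik dan paralimpik tokyo dua ribu dua puluh",
    "start": 3.52,
    "end": 6.46,
    "audio_filename": "processed-audio/1-225586-0.mp3",
    "original_audio_filename": "output-audio/3-1084-10.mp3"
}
```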
## how-to
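```bash
git clone https://huggingface.co/datasets/mesolitica/pseudostreaming-malaya-speech-stt
cd pseudostreaming-malaya-speech-stt
wget https://www.7-zip.org/a/7z2301-linux-x64.tar.xz
tar -xf 7z2301-linux-x64.tar.xz
./7zz x processed-audio.7z.001 -y -mmt40
```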
| [
"# Pseudostreaming Malaysian Youtube videos using Whisper Large V3\n\nOriginal dataset at URL\n\nWe use URL to generate pseudostreaming dataset, source code at URL\n\nTotal 40486.589364839296 hours.\n\ndata format from URL,",
"## how-to"
] | [
"TAGS\n#task_categories-automatic-speech-recognition #language-Malay (macrolanguage) #license-mit #region-us \n",
"# Pseudostreaming Malaysian Youtube videos using Whisper Large V3\n\nOriginal dataset at URL\n\nWe use URL to generate pseudostreaming dataset, source code at URL\n\nTotal 40486.589364839296 hours.\n\ndata format from URL,",
"## how-to"
] |
773e3e58375a4ca8e98f80589535834479cf9479 |
# Dataset Card for Evaluation run of Weyaxi/Einstein-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-7B](https://huggingface.co/Weyaxi/Einstein-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-23T20:03:38.754499](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-7B/blob/main/results_2024-01-23T20-03-38.754499.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6263331890948844,
"acc_stderr": 0.03267477019841745,
"acc_norm": 0.6321088930020295,
"acc_norm_stderr": 0.033337201077586746,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237269,
"mc2": 0.4254749966629292,
"mc2_stderr": 0.014478834236572929
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522082,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.014212444980651892
},
"harness|hellaswag|10": {
"acc": 0.6474805815574587,
"acc_stderr": 0.004767782256040993,
"acc_norm": 0.8434574785899224,
"acc_norm_stderr": 0.0036262628054422185
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073406,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073406
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854054,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854054
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899129,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899129
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.01577491142238163,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.01577491142238163
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.023929155517351305,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.023929155517351305
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.02645722506781103,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.02645722506781103
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190442,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237269,
"mc2": 0.4254749966629292,
"mc2_stderr": 0.014478834236572929
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126737
},
"harness|gsm8k|5": {
"acc": 0.3601213040181956,
"acc_stderr": 0.013222559423250497
}
}
```
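The aggregated numbers above can also be read back from the "results" configuration mentioned earlier; a minimal sketch (the exact row schema of the results parquet is not shown here):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; "latest" always
# points at the newest evaluation timestamp
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-7B",
                       "results",
                       split="latest")
print(results[0])  # one row holding the aggregated results of the run
```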
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Einstein-7B | [
"region:us"
] | 2024-01-23T20:05:58+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Einstein-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-7B](https://huggingface.co/Weyaxi/Einstein-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Einstein-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T20:03:38.754499](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-7B/blob/main/results_2024-01-23T20-03-38.754499.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6263331890948844,\n \"acc_stderr\": 0.03267477019841745,\n \"acc_norm\": 0.6321088930020295,\n \"acc_norm_stderr\": 0.033337201077586746,\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237269,\n \"mc2\": 0.4254749966629292,\n \"mc2_stderr\": 0.014478834236572929\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522082,\n \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.014212444980651892\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6474805815574587,\n \"acc_stderr\": 0.004767782256040993,\n \"acc_norm\": 0.8434574785899224,\n \"acc_norm_stderr\": 0.0036262628054422185\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073406,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073406\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854054,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854054\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.013964393769899129,\n 
\"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.013964393769899129\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n \"acc_stderr\": 0.01577491142238163,\n \"acc_norm\": 0.3340782122905028,\n \"acc_norm_stderr\": 0.01577491142238163\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.023929155517351305,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.023929155517351305\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.02645722506781103,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.02645722506781103\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.012732398286190442,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.012732398286190442\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653697,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653697\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237269,\n \"mc2\": 0.4254749966629292,\n \"mc2_stderr\": 0.014478834236572929\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126737\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3601213040181956,\n \"acc_stderr\": 0.013222559423250497\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Einstein-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|arc:challenge|25_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|gsm8k|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hellaswag|10_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-03-38.754499.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-03-38.754499.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-03-38.754499.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T20-03-38.754499.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-03-38.754499.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T20_03_38.754499", "path": ["**/details_harness|winogrande|5_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T20-03-38.754499.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T20_03_38.754499", "path": ["results_2024-01-23T20-03-38.754499.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T20-03-38.754499.parquet"]}]}]} | 2024-01-23T20:06:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Einstein-7B
Dataset automatically created during the evaluation run of model Weyaxi/Einstein-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
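A minimal sketch, assuming this run's details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the exact repo id is inferred here, not stated above):

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's standard naming pattern for details datasets.
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-7B",
	"harness_winogrande_5",
	split="train")
```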
## Latest results
These are the latest results from run 2024-01-23T20:03:38.754499 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Einstein-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Einstein-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T20:03:38.754499(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Einstein-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Einstein-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T20:03:38.754499(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8dd82e37de6e9009df8e3d6001e88abdbf129338 |
# Dataset Card for Evaluation run of BarryFutureman/NeuralLake-Variant1-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/NeuralLake-Variant1-7B](https://huggingface.co/BarryFutureman/NeuralLake-Variant1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B",
"harness_winogrande_5",
split="train")
```
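To work with the aggregated metrics instead of per-sample details, a minimal sketch loading the "results" configuration; the `latest` split name is taken from this repository's config listing and may change as new runs are added:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics for each run; the "latest" split
# always points at the most recent one.
results = load_dataset("open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B",
	"results",
	split="latest")
```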
## Latest results
These are the [latest results from run 2024-01-23T20:08:48.201286](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B/blob/main/results_2024-01-23T20-08-48.201286.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6527879049855548,
"acc_stderr": 0.032052113329256254,
"acc_norm": 0.652189910746759,
"acc_norm_stderr": 0.032721608531391104,
"mc1": 0.5483476132190942,
"mc1_stderr": 0.017421480300277643,
"mc2": 0.6837155338410112,
"mc2_stderr": 0.015180251006560648
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.01336308010724448,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.0129550659637107
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.004495891440519419,
"acc_norm": 0.8844851623182632,
"acc_norm_stderr": 0.003189889789404668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092437,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156861,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156861
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.01657899743549672,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.01657899743549672
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5483476132190942,
"mc1_stderr": 0.017421480300277643,
"mc2": 0.6837155338410112,
"mc2_stderr": 0.015180251006560648
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775777
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131709
}
}
```
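Most keys above follow the pattern `harness|<task>|<n_shots>`. As a small illustration (assuming the JSON above has been loaded into a Python dict named `results`; the name is hypothetical), the per-subtask MMLU accuracies can be pulled out like this:

```python
# `results` is assumed to be the dict shown above; MMLU subtasks are the
# keys of the form "harness|hendrycksTest-<subtask>|5".
mmlu = {
    key.split("|")[1].removeprefix("hendrycksTest-"): scores["acc"]
    for key, scores in results.items()
    if "hendrycksTest" in key
}
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```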
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B | [
"region:us"
] | 2024-01-23T20:11:08+00:00 | {"pretty_name": "Evaluation run of BarryFutureman/NeuralLake-Variant1-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarryFutureman/NeuralLake-Variant1-7B](https://huggingface.co/BarryFutureman/NeuralLake-Variant1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T20:08:48.201286](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__NeuralLake-Variant1-7B/blob/main/results_2024-01-23T20-08-48.201286.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527879049855548,\n \"acc_stderr\": 0.032052113329256254,\n \"acc_norm\": 0.652189910746759,\n \"acc_norm_stderr\": 0.032721608531391104,\n \"mc1\": 0.5483476132190942,\n \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6837155338410112,\n \"mc2_stderr\": 0.015180251006560648\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.01336308010724448,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.0129550659637107\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n \"acc_stderr\": 0.004495891440519419,\n \"acc_norm\": 0.8844851623182632,\n \"acc_norm_stderr\": 0.003189889789404668\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092437,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092437\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8212005108556832,\n \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n \"acc_stderr\": 0.01657899743549672,\n \"acc_norm\": 0.4346368715083799,\n \"acc_norm_stderr\": 0.01657899743549672\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5483476132190942,\n \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6837155338410112,\n \"mc2_stderr\": 0.015180251006560648\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775777\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.012705685723131709\n }\n}\n```", "repo_url": 
"https://huggingface.co/BarryFutureman/NeuralLake-Variant1-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|arc:challenge|25_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|gsm8k|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hellaswag|10_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T20_08_48.201286", "path": ["**/details_harness|winogrande|5_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T20-08-48.201286.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T20_08_48.201286", "path": ["results_2024-01-23T20-08-48.201286.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T20-08-48.201286.parquet"]}]}]} | 2024-01-23T20:11:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarryFutureman/NeuralLake-Variant1-7B
Dataset automatically created during the evaluation run of model BarryFutureman/NeuralLake-Variant1-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-23T20:08:48.201286 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarryFutureman/NeuralLake-Variant1-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/NeuralLake-Variant1-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T20:08:48.201286(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarryFutureman/NeuralLake-Variant1-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/NeuralLake-Variant1-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T20:08:48.201286(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e1d9b458a57f92769b1a09a088f96f1495fd6ef8 |
## Dataset Details
### Dataset Description
TP4 is a comprehensive dataset containing a curated collection of questions and answers from Stack Overflow. Focused on the realms of Python programming, NumPy, Pandas, TensorFlow, and PyTorch, TP4 includes essential attributes such as question ID, title, question body, answer body, associated tags, and score. This dataset is designed to facilitate research, analysis, and exploration of inquiries and solutions within the Python and machine learning communities on Stack Overflow.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
- Question ID: Unique identifiers for each question, facilitating easy referencing and linkage.
- Title: Concise titles summarizing the essence of each question.
- Question and Answer Bodies: Rich textual content providing detailed context and solutions.
- Tags: Categorization labels such as 'python', 'numpy', 'pandas', 'tensorflow', and 'pytorch' for efficient filtering.
- Score: Numerical representation of the community's evaluation of the question or answer.
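As a quick illustration, here is a minimal sketch of loading and filtering the dataset with the Hugging Face `datasets` library; the column names `Tags`, `Score`, and `Title` are assumptions taken from the field descriptions above, so adjust them to the actual schema if they differ:

```python
from datasets import load_dataset

# Load the TP4 collection (assuming a default "train" split).
ds = load_dataset("Syed-Hasan-8503/StackOverflow-TP4-1M", split="train")

# Keep highly scored questions tagged with pandas.
# "Tags" and "Score" are assumed column names from the descriptions above.
top_pandas = ds.filter(lambda row: "pandas" in row["Tags"] and row["Score"] >= 10)

print(top_pandas[0]["Title"])
```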
## Dataset Card Authors
SYED HASAN ABBAS
| Syed-Hasan-8503/StackOverflow-TP4-1M | [
"task_categories:question-answering",
"size_categories:1M<n<10M",
"language:en",
"code",
"region:us"
] | 2024-01-23T20:36:27+00:00 | {"language": ["en"], "size_categories": ["1M<n<10M"], "task_categories": ["question-answering"], "pretty_name": "StackOverflow-TP4-1M", "tags": ["code"]} | 2024-01-23T21:29:41+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-1M<n<10M #language-English #code #region-us
|
## Dataset Details
### Dataset Description
TP4 is a comprehensive dataset containing a curated collection of questions and answers from Stack Overflow. Focused on the realms of Python programming, NumPy, Pandas, TensorFlow, and PyTorch, TP4 includes essential attributes such as question ID, title, question body, answer body, associated tags, and score. This dataset is designed to facilitate research, analysis, and exploration of inquiries and solutions within the Python and machine learning communities on Stack Overflow.
## Dataset Structure
- Question ID: Unique identifiers for each question, facilitating easy referencing and linkage.
- Title: Concise titles summarizing the essence of each question.
- Question and Answer Bodies: Rich textual content providing detailed context and solutions.
- Tags: Categorization labels such as 'python', 'numpy', 'pandas', 'tensorflow', and 'pytorch' for efficient filtering.
- Score: Numerical representation of the community's evaluation of the question or answer.
## Dataset Card Authors
SYED HASAN ABBAS
| [
"## Dataset Details",
"### Dataset Description\n\n\nTP4 is a comprehensive dataset containing a curated collection of questions and answers from Stack Overflow. Focused on the realms of Python programming, NumPy, Pandas, TensorFlow, and PyTorch, TP4 includes essential attributes such as question ID, title, question body, answer body, associated tags, and score. This dataset is designed to facilitate research, analysis, and exploration of inquiries and solutions within the Python and machine learning communities on Stack Overflow.",
"## Dataset Structure\n\n\n-Question ID: Unique identifiers for each question, facilitating easy referencing and linkage.\n-Title: Concise titles summarizing the essence of each question.\n-Question and Answer Bodies: Rich textual content providing detailed context and solutions.\n-Tags: Categorization labels such as 'python', 'numpy', 'pandas', 'tensorflow', and 'pytorch' for efficient filtering.\n-Score: Numerical representation of the community's evaluation of the question or answer.",
"## Dataset Card Authors\n\nSYED HASAN ABBAS"
] | [
"TAGS\n#task_categories-question-answering #size_categories-1M<n<10M #language-English #code #region-us \n",
"## Dataset Details",
"### Dataset Description\n\n\nTP4 is a comprehensive dataset containing a curated collection of questions and answers from Stack Overflow. Focused on the realms of Python programming, NumPy, Pandas, TensorFlow, and PyTorch, TP4 includes essential attributes such as question ID, title, question body, answer body, associated tags, and score. This dataset is designed to facilitate research, analysis, and exploration of inquiries and solutions within the Python and machine learning communities on Stack Overflow.",
"## Dataset Structure\n\n\n-Question ID: Unique identifiers for each question, facilitating easy referencing and linkage.\n-Title: Concise titles summarizing the essence of each question.\n-Question and Answer Bodies: Rich textual content providing detailed context and solutions.\n-Tags: Categorization labels such as 'python', 'numpy', 'pandas', 'tensorflow', and 'pytorch' for efficient filtering.\n-Score: Numerical representation of the community's evaluation of the question or answer.",
"## Dataset Card Authors\n\nSYED HASAN ABBAS"
] |
779e19d2158d72fe3d5aae4d0f307f10657e418a | # Dataset Card for "self-reward-dev1706042310"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/self-reward-dev1706042310 | [
"region:us"
] | 2024-01-23T20:39:32+00:00 | {"dataset_info": {"features": [{"name": "llm_as_a_judge_prompt", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft_sft", "num_bytes": 717646, "num_examples": 128}], "download_size": 287491, "dataset_size": 717646}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft_sft", "path": "data/train_sft_sft-*"}]}]} | 2024-01-23T20:39:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "self-reward-dev1706042310"
More Information needed | [
"# Dataset Card for \"self-reward-dev1706042310\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"self-reward-dev1706042310\"\n\nMore Information needed"
] |
f3b7a12716cc2e5e93cf40d0143d656ca470ee6e | # Dataset Card for "UC-mix-llamav2-fromat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | bcui19/UC-mix-llamav2-fromat | [
"region:us"
] | 2024-01-23T20:41:10+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 595846376, "num_examples": 171968}], "download_size": 303015618, "dataset_size": 595846376}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-23T20:41:28+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "UC-mix-llamav2-fromat"
More Information needed | [
"# Dataset Card for \"UC-mix-llamav2-fromat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"UC-mix-llamav2-fromat\"\n\nMore Information needed"
] |
b4900cef3d8e66288b728b6476bffae4d5ddda44 | # Dataset Card for "evesix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sam-mosaic/evesix-llama-fmt | [
"region:us"
] | 2024-01-23T20:55:23+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 775938568, "num_examples": 486455}], "download_size": 0, "dataset_size": 775938568}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-30T11:00:08+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "evesix"
More Information needed | [
"# Dataset Card for \"evesix\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"evesix\"\n\nMore Information needed"
] |
8409e8a2e7ac5daf34021b19742e2fb8dc13c578 | # Dataset Card for "self-reward-dev1706043619"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/self-reward-dev1706043619 | [
"region:us"
] | 2024-01-23T21:03:05+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "candidate_0", "dtype": "string"}, {"name": "score_0", "dtype": "float64"}, {"name": "candidate_1", "dtype": "string"}, {"name": "score_1", "dtype": "float64"}, {"name": "candidate_2", "dtype": "string"}, {"name": "score_2", "dtype": "float64"}, {"name": "candidate_3", "dtype": "string"}, {"name": "score_3", "dtype": "float64"}, {"name": "chosen", "dtype": "string"}, {"name": "chosen_score", "dtype": "float64"}, {"name": "chosen_idx", "dtype": "int64"}, {"name": "rejected", "dtype": "string"}, {"name": "rejected_score", "dtype": "float64"}, {"name": "rejected_idx", "dtype": "int64"}], "splits": [{"name": "train_sft_sft", "num_bytes": 54843, "num_examples": 4}, {"name": "test_sft_sft", "num_bytes": 41591, "num_examples": 4}, {"name": "train_gen_sft", "num_bytes": 63759, "num_examples": 4}, {"name": "test_gen_sft", "num_bytes": 57101, "num_examples": 4}], "download_size": 386014, "dataset_size": 217294}, "configs": [{"config_name": "default", "data_files": [{"split": "train_sft_sft", "path": "data/train_sft_sft-*"}, {"split": "test_sft_sft", "path": "data/test_sft_sft-*"}, {"split": "train_gen_sft", "path": "data/train_gen_sft-*"}, {"split": "test_gen_sft", "path": "data/test_gen_sft-*"}]}]} | 2024-01-23T21:03:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "self-reward-dev1706043619"
More Information needed | [
"# Dataset Card for \"self-reward-dev1706043619\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"self-reward-dev1706043619\"\n\nMore Information needed"
] |
38408872e8288c7ae329ed588b80f715909b0390 |
A collection of translations from Portuguese to Angrarosskesh, my fictional language. | matjs/pt_to_an | [
"task_categories:translation",
"size_categories:1K<n<10K",
"language:pt",
"license:mit",
"region:us"
] | 2024-01-23T21:24:27+00:00 | {"language": ["pt"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["translation"], "pretty_name": "PT-AN"} | 2024-02-03T14:07:27+00:00 | [] | [
"pt"
] | TAGS
#task_categories-translation #size_categories-1K<n<10K #language-Portuguese #license-mit #region-us
|
A collection of translations from Portuguese to Angrarosskesh, my fictional language. | [] | [
"TAGS\n#task_categories-translation #size_categories-1K<n<10K #language-Portuguese #license-mit #region-us \n"
] |
b0c39e7a26a9bc47917abaae740449c14aff4086 | # lilac/HellaSwag
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/Rowan/hellaswag](https://huggingface.co/datasets/Rowan/hellaswag)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-HellaSwag
```
or from python with:
```py
ll.download("lilacai/lilac-HellaSwag")
```
| lilacai/lilac-HellaSwag | [
"Lilac",
"region:us"
] | 2024-01-23T21:29:20+00:00 | {"tags": ["Lilac"]} | 2024-01-23T21:29:24+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/HellaSwag
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/HellaSwag\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/HellaSwag\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
ab3183a7608d1aede49e5e4db7317d1bd7891246 |
# Dataset Card for Evaluation run of macadeliccc/piccolo-math-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/piccolo-math-2x7b](https://huggingface.co/macadeliccc/piccolo-math-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-23T21:39:07.430696](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b/blob/main/results_2024-01-23T21-39-07.430696.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.64218516330683,
"acc_stderr": 0.03223781750024571,
"acc_norm": 0.6418148031090513,
"acc_norm_stderr": 0.03290014345969884,
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6385532906891974,
"mc2_stderr": 0.01575881107075601
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344
},
"harness|hellaswag|10": {
"acc": 0.7062338179645489,
"acc_stderr": 0.004545552424153376,
"acc_norm": 0.8727345150368453,
"acc_norm_stderr": 0.003325890225529858
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997685,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997685
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266868,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.01625113971157077,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.01625113971157077
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6385532906891974,
"mc2_stderr": 0.01575881107075601
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.01126851997157768
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693627
}
}
```
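The aggregated numbers above are also exposed through the "results" configuration mentioned earlier. As a minimal sketch reusing the same `load_dataset` pattern (the config name comes from this dataset's configuration list):

```python
from datasets import load_dataset

# Load the aggregated results shown above (the "results" configuration).
results = load_dataset("open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b",
	"results",
	split="train")
```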
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b | [
"region:us"
] | 2024-01-23T21:41:26+00:00 | {"pretty_name": "Evaluation run of macadeliccc/piccolo-math-2x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/piccolo-math-2x7b](https://huggingface.co/macadeliccc/piccolo-math-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T21:39:07.430696](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b/blob/main/results_2024-01-23T21-39-07.430696.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.64218516330683,\n \"acc_stderr\": 0.03223781750024571,\n \"acc_norm\": 0.6418148031090513,\n \"acc_norm_stderr\": 0.03290014345969884,\n \"mc1\": 0.4785801713586291,\n \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6385532906891974,\n \"mc2_stderr\": 0.01575881107075601\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7062338179645489,\n \"acc_stderr\": 0.004545552424153376,\n \"acc_norm\": 0.8727345150368453,\n \"acc_norm_stderr\": 0.003325890225529858\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997685,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997685\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266868,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 
0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.01625113971157077,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.01625113971157077\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4785801713586291,\n \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6385532906891974,\n \"mc2_stderr\": 0.01575881107075601\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.01126851997157768\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \"acc_stderr\": 0.012607137125693627\n }\n}\n```", "repo_url": 
"https://huggingface.co/macadeliccc/piccolo-math-2x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|arc:challenge|25_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|gsm8k|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hellaswag|10_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T21_39_07.430696", "path": ["**/details_harness|winogrande|5_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T21-39-07.430696.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T21_39_07.430696", "path": ["results_2024-01-23T21-39-07.430696.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T21-39-07.430696.parquet"]}]}]} | 2024-01-23T21:41:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/piccolo-math-2x7b
Dataset automatically created during the evaluation run of model macadeliccc/piccolo-math-2x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
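A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming:

```python
from datasets import load_dataset

# repository name assumed from the leaderboard's naming convention
data = load_dataset("open-llm-leaderboard/details_macadeliccc__piccolo-math-2x7b",
	"harness_winogrande_5",
	split="train")
```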
## Latest results
These are the latest results from run 2024-01-23T21:39:07.430696 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/piccolo-math-2x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/piccolo-math-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T21:39:07.430696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/piccolo-math-2x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/piccolo-math-2x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T21:39:07.430696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8b613e817888c123cace04c0fe7a8516694782c8 | # lilac/HumanEval
This dataset is a [Lilac](http://lilacml.com)-processed dataset. Original dataset: [https://huggingface.co/datasets/openai_humaneval](https://huggingface.co/datasets/openai_humaneval)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-HumanEval
```
or from python with:
```py
import lilac as ll  # assumes the standard alias for the lilac package

ll.download("lilacai/lilac-HumanEval")
```
| lilacai/lilac-HumanEval | [
"Lilac",
"region:us"
] | 2024-01-23T21:52:34+00:00 | {"tags": ["Lilac"]} | 2024-01-23T21:53:20+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/HumanEval
This dataset is a Lilac-processed dataset. Original dataset: URL
To download the dataset to a local directory:
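```bash
lilac download lilacai/lilac-HumanEval
```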
or from python with:
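```py
import lilac as ll  # assumes the standard alias for the lilac package

ll.download("lilacai/lilac-HumanEval")
```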
| [
"# lilac/HumanEval\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/HumanEval\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
bb0c987555b5bf631651100767949f808509be74 |
# Dataset Card for "agieval-gaokao-mathcloze"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo, following the dmayhem93/agieval-* datasets on the HF Hub.
This dataset contains the contents of the Gaokao-mathcloze subtask of AGIEval, as accessed at https://github.com/ruixiangcui/AGIEval/commit/5c77d073fda993f1652eaae3cf5d04cc5fd21d40.
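For reference, the subtask can be loaded with the `datasets` library (a sketch; the "default" config and "test" split names come from this card's metadata):

```python
from datasets import load_dataset

# single "test" split under the default config, per the dataset metadata
data = load_dataset("hails/agieval-gaokao-mathcloze", split="test")
```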
Citation:
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
} | hails/agieval-gaokao-mathcloze | [
"language:zh",
"arxiv:2304.06364",
"region:us"
] | 2024-01-23T22:03:04+00:00 | {"language": ["zh"], "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 24078, "num_examples": 118}], "download_size": 14715, "dataset_size": 24078}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]} | 2024-01-26T18:28:10+00:00 | [
"2304.06364"
] | [
"zh"
] | TAGS
#language-Chinese #arxiv-2304.06364 #region-us
|
# Dataset Card for "agieval-gaokao-mathcloze"
Dataset taken from URL and processed as in that repo, following the dmayhem93/agieval-* datasets on the HF Hub.
This dataset contains the contents of the Gaokao-mathcloze subtask of AGIEval, as accessed at URL.
Citation:
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
} | [
"# Dataset Card for \"agieval-gaokao-mathcloze\"\n\n\nDataset taken from URL and processed as in that repo, following dmayhem93/agieval-* datasets on the HF hub.\n\nThis dataset contains the contents of the Gaokao-mathcloze subtask of AGIEval, as accessed in URL .\n\n\nCitation:\n\n\n@misc\n\n{zhong2023agieval,\ntitle={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},\nauthor={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},\nyear={2023},\neprint={2304.06364},\narchivePrefix={arXiv},\nprimaryClass={cs.CL}\n}"
] | [
"TAGS\n#language-Chinese #arxiv-2304.06364 #region-us \n",
"# Dataset Card for \"agieval-gaokao-mathcloze\"\n\n\nDataset taken from URL and processed as in that repo, following dmayhem93/agieval-* datasets on the HF hub.\n\nThis dataset contains the contents of the Gaokao-mathcloze subtask of AGIEval, as accessed in URL .\n\n\nCitation:\n\n\n@misc\n\n{zhong2023agieval,\ntitle={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},\nauthor={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},\nyear={2023},\neprint={2304.06364},\narchivePrefix={arXiv},\nprimaryClass={cs.CL}\n}"
] |
de1d112d74e718c2c36d194353aac5eb345fbe79 |
# Dataset Card for Evaluation run of Aryanne/sheared-plus-westlake-50_75p
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aryanne/sheared-plus-westlake-50_75p](https://huggingface.co/Aryanne/sheared-plus-westlake-50_75p) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-23T22:04:31.166175](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p/blob/main/results_2024-01-23T22-04-31.166175.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2672140448144015,
"acc_stderr": 0.03127543112931,
"acc_norm": 0.26909676356851875,
"acc_norm_stderr": 0.03210076459110669,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.42638955634632064,
"mc2_stderr": 0.014788435851867392
},
"harness|arc:challenge|25": {
"acc": 0.3310580204778157,
"acc_stderr": 0.013752062419817836,
"acc_norm": 0.34044368600682595,
"acc_norm_stderr": 0.013847460518892983
},
"harness|hellaswag|10": {
"acc": 0.4441346345349532,
"acc_stderr": 0.004958537988993581,
"acc_norm": 0.5804620593507269,
"acc_norm_stderr": 0.004924748500639335
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952924,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118355,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118355
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276863,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276863
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114468,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041156,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041156
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654824,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654824
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19747899159663865,
"acc_stderr": 0.025859164122051463,
"acc_norm": 0.19747899159663865,
"acc_norm_stderr": 0.025859164122051463
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20733944954128442,
"acc_stderr": 0.017381415563608664,
"acc_norm": 0.20733944954128442,
"acc_norm_stderr": 0.017381415563608664
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.02742100729539294,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.02742100729539294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.01516202415227844,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.01516202415227844
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2976878612716763,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.2976878612716763,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.02417084087934101,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.02417084087934101
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705474,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2379400260756193,
"acc_stderr": 0.010875700787694228,
"acc_norm": 0.2379400260756193,
"acc_norm_stderr": 0.010875700787694228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.42638955634632064,
"mc2_stderr": 0.014788435851867392
},
"harness|winogrande|5": {
"acc": 0.569060773480663,
"acc_stderr": 0.013917796623335964
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
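Each `acc_stderr` value above is the standard error of the corresponding accuracy estimate. As a quick, illustrative sketch (a plain normal approximation, not something the evaluation harness reports itself), a rough 95% confidence interval for a score can be read off as acc ± 1.96 × stderr:

```python
# Rough 95% confidence interval for the winogrande score reported above
# (normal approximation; illustrative only, values copied from the results JSON).
acc, stderr = 0.569060773480663, 0.013917796623335964
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"winogrande 5-shot acc: {acc:.4f} (95% CI ~ [{low:.4f}, {high:.4f}])")
```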
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p | [
"region:us"
] | 2024-01-23T22:06:53+00:00 | {"pretty_name": "Evaluation run of Aryanne/sheared-plus-westlake-50_75p", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aryanne/sheared-plus-westlake-50_75p](https://huggingface.co/Aryanne/sheared-plus-westlake-50_75p) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T22:04:31.166175](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p/blob/main/results_2024-01-23T22-04-31.166175.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2672140448144015,\n \"acc_stderr\": 0.03127543112931,\n \"acc_norm\": 0.26909676356851875,\n \"acc_norm_stderr\": 0.03210076459110669,\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.42638955634632064,\n \"mc2_stderr\": 0.014788435851867392\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3310580204778157,\n \"acc_stderr\": 0.013752062419817836,\n \"acc_norm\": 0.34044368600682595,\n \"acc_norm_stderr\": 0.013847460518892983\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4441346345349532,\n \"acc_stderr\": 0.004958537988993581,\n \"acc_norm\": 0.5804620593507269,\n \"acc_norm_stderr\": 0.004924748500639335\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952924,\n \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952924\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118355,\n \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118355\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349417,\n \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276863,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276863\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n \"acc_stderr\": 0.02447224384089553,\n \"acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.02447224384089553\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114468,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041156,\n \"acc_norm\": 
0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041156\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.025859164122051463,\n \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.025859164122051463\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.20733944954128442,\n \"acc_stderr\": 0.017381415563608664,\n \"acc_norm\": 0.20733944954128442,\n \"acc_norm_stderr\": 0.017381415563608664\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n \"acc_stderr\": 0.02742100729539294,\n \"acc_norm\": 0.2264957264957265,\n \"acc_norm_stderr\": 0.02742100729539294\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 
0.041633319989322695\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n \"acc_stderr\": 0.01516202415227844,\n \"acc_norm\": 0.23499361430395913,\n \"acc_norm_stderr\": 0.01516202415227844\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2976878612716763,\n \"acc_stderr\": 0.024617055388677003,\n \"acc_norm\": 0.2976878612716763,\n \"acc_norm_stderr\": 0.024617055388677003\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934101,\n \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934101\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262192,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262192\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705474,\n \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705474\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2379400260756193,\n \"acc_stderr\": 0.010875700787694228,\n \"acc_norm\": 0.2379400260756193,\n \"acc_norm_stderr\": 0.010875700787694228\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411955,\n \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411955\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.42638955634632064,\n \"mc2_stderr\": 0.014788435851867392\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.569060773480663,\n \"acc_stderr\": 
0.013917796623335964\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Aryanne/sheared-plus-westlake-50_75p", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|arc:challenge|25_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|gsm8k|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hellaswag|10_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["**/details_harness|winogrande|5_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-23T22-04-31.166175.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T22_04_31.166175", "path": ["results_2024-01-23T22-04-31.166175.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T22-04-31.166175.parquet"]}]}]} | 2024-01-23T22:07:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Aryanne/sheared-plus-westlake-50_75p
Dataset automatically created during the evaluation run of model Aryanne/sheared-plus-westlake-50_75p on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
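The snippet below mirrors the loading example embedded in this card's metadata; the repository and configuration names are taken from there rather than invented:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p",
                    "harness_winogrande_5",
                    split="train")
```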
## Latest results
These are the latest results from run 2024-01-23T22:04:31.166175 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
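As a small sketch of the above: the aggregated numbers live in the "results" configuration, and its "latest" split always resolves to the newest run (both names appear in this card's configuration metadata):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p",
                       "results",
                       split="latest")
```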
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
# lilac/mbpp
This dataset is a [Lilac](http://lilacml.com)-processed dataset. Original dataset: [https://huggingface.co/datasets/mbpp](https://huggingface.co/datasets/mbpp)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-mbpp
```
or from Python with:
```py
import lilac as ll

# Download the Lilac-processed dataset from the HuggingFace Hub.
ll.download("lilacai/lilac-mbpp")
```
# Dataset Card for Evaluation run of Aryanne/sheared-plus-westlake-normal
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aryanne/sheared-plus-westlake-normal](https://huggingface.co/Aryanne/sheared-plus-westlake-normal) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-normal",
"harness_winogrande_5",
split="train")
```
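Similarly, the aggregated metrics described above can be loaded through the "results" configuration; a minimal sketch:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration described above
# (the "train" split points to the latest run).
results = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-normal",
	"results",
	split="train")
print(results)
```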
## Latest results
These are the [latest results from run 2024-01-23T22:07:40.446133](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-normal/blob/main/results_2024-01-23T22-07-40.446133.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2758004334882251,
"acc_stderr": 0.03136950403182195,
"acc_norm": 0.2776902120501473,
"acc_norm_stderr": 0.03219984767349852,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.46502873385230303,
"mc2_stderr": 0.01537322529293053
},
"harness|arc:challenge|25": {
"acc": 0.36945392491467577,
"acc_stderr": 0.014104578366491887,
"acc_norm": 0.39761092150170646,
"acc_norm_stderr": 0.014301752223279535
},
"harness|hellaswag|10": {
"acc": 0.536247759410476,
"acc_stderr": 0.004976651989757638,
"acc_norm": 0.7033459470225055,
"acc_norm_stderr": 0.0045584915506736885
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.036906779861372814,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.036906779861372814
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.02694748312149623,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.02694748312149623
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491842,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333338,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333338
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180277,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180277
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.31088082901554404,
"acc_stderr": 0.03340361906276587,
"acc_norm": 0.31088082901554404,
"acc_norm_stderr": 0.03340361906276587
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654835,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654835
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959905,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959905
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671548,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996586,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996586
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869327,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869327
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955914,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4214876033057851,
"acc_stderr": 0.04507732278775094,
"acc_norm": 0.4214876033057851,
"acc_norm_stderr": 0.04507732278775094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822586,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822586
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3162393162393162,
"acc_stderr": 0.030463656747340268,
"acc_norm": 0.3162393162393162,
"acc_norm_stderr": 0.030463656747340268
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.016117318166832265,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.016117318166832265
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.024685316867257796,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.024685316867257796
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.01433352205921789,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.01433352205921789
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31189710610932475,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.31189710610932475,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3271604938271605,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.3271604938271605,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290403,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290403
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2737940026075619,
"acc_stderr": 0.011388612167979387,
"acc_norm": 0.2737940026075619,
"acc_norm_stderr": 0.011388612167979387
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23897058823529413,
"acc_stderr": 0.025905280644893017,
"acc_norm": 0.23897058823529413,
"acc_norm_stderr": 0.025905280644893017
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.018152871051538812,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.018152871051538812
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727756,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27346938775510204,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.27346938775510204,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014666,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014666
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594688,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594688
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.46502873385230303,
"mc2_stderr": 0.01537322529293053
},
"harness|winogrande|5": {
"acc": 0.6353591160220995,
"acc_stderr": 0.013527746622429837
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
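For offline analysis, the JSON linked above can be parsed directly; a minimal sketch, assuming a local copy of the results file (the filename mirrors the link and may need adjusting):

```python
import json

# Rank the harness tasks by accuracy from a local copy of the results file.
# Entries without an "acc" field (e.g. truthfulqa's mc1/mc2) are skipped.
with open("results_2024-01-23T22-07-40.446133.json") as f:
    results = json.load(f)

tasks = [(name, m["acc"]) for name, m in results.items() if "acc" in m]
for name, acc in sorted(tasks, key=lambda t: t[1]):
    print(f"{name}: acc={acc:.4f}")
```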
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
["**/details_harness|winogrande|5_2024-01-23T22-07-40.446133.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T22_07_40.446133", "path": ["results_2024-01-23T22-07-40.446133.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T22-07-40.446133.parquet"]}]}]} | 2024-01-23T22:10:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Aryanne/sheared-plus-westlake-normal
Dataset automatically created during the evaluation run of model Aryanne/sheared-plus-westlake-normal on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
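A minimal sketch of that load call, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (the exact repository id and config name below are inferred, not stated in this card):

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-normal",
	"harness_winogrande_5",
	split="train")
```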
## Latest results
These are the latest results from run 2024-01-23T22:07:40.446133 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Aryanne/sheared-plus-westlake-normal\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/sheared-plus-westlake-normal on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T22:07:40.446133(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Aryanne/sheared-plus-westlake-normal\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/sheared-plus-westlake-normal on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T22:07:40.446133(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3b4728261009ad18236d0dc38e498517db819965 | # Dataset Card for "autotrain-data-autotrain-h1dyg-dlim9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Viggnesh/autotrain-data-autotrain-h1dyg-dlim9 | [
"region:us"
] | 2024-01-23T22:18:51+00:00 | {"dataset_info": {"features": [{"name": "autotrain_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 324, "num_examples": 1}, {"name": "validation", "num_bytes": 324, "num_examples": 1}], "download_size": 6048, "dataset_size": 648}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-23T22:18:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "autotrain-data-autotrain-h1dyg-dlim9"
More Information needed | [
"# Dataset Card for \"autotrain-data-autotrain-h1dyg-dlim9\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"autotrain-data-autotrain-h1dyg-dlim9\"\n\nMore Information needed"
] |
27aa2fdf0a59ad8970a221e017d14bdc0ccd1dd6 |
# Dataset Card for Evaluation run of AiMavenAi/MavenWest
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AiMavenAi/MavenWest](https://huggingface.co/AiMavenAi/MavenWest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AiMavenAi__MavenWest",
"harness_winogrande_5",
split="train")
```
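Because the repository exposes 63 separate configurations, it can be easier to enumerate them programmatically than to type each name. A short sketch using the `datasets` helper (this is illustrative and not part of the original card):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_AiMavenAi__MavenWest"

# List every per-task configuration stored in this details repository.
for config in get_dataset_config_names(repo):
    print(config)

# Each configuration also exposes a "latest" split pointing at the most recent run.
winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")
print(winogrande)
```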
## Latest results
These are the [latest results from run 2024-01-23T22:26:42.328277](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__MavenWest/blob/main/results_2024-01-23T22-26-42.328277.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6519949800012913,
"acc_stderr": 0.03226904531191905,
"acc_norm": 0.651558948732219,
"acc_norm_stderr": 0.032941911568037885,
"mc1": 0.5055079559363526,
"mc1_stderr": 0.01750243899045107,
"mc2": 0.6529155692943942,
"mc2_stderr": 0.015412828995723143
},
"harness|arc:challenge|25": {
"acc": 0.6953924914675768,
"acc_stderr": 0.013449522109932489,
"acc_norm": 0.7158703071672355,
"acc_norm_stderr": 0.013179442447653886
},
"harness|hellaswag|10": {
"acc": 0.7135032861979685,
"acc_stderr": 0.004512002459757957,
"acc_norm": 0.8843855805616411,
"acc_norm_stderr": 0.003191084792793155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461783,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.464804469273743,
"acc_stderr": 0.01668102093107665,
"acc_norm": 0.464804469273743,
"acc_norm_stderr": 0.01668102093107665
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632952,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632952
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5055079559363526,
"mc1_stderr": 0.01750243899045107,
"mc2": 0.6529155692943942,
"mc2_stderr": 0.015412828995723143
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.012757375376754941
}
}
```
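As a worked example of reading these numbers, the sketch below recomputes the mean normalized accuracy over the MMLU (`hendrycksTest`) subtasks from a local copy of the JSON above. The file name matches the results file linked in this card, and the flat field layout is assumed to be exactly the snippet shown here:

```python
import json

# Assumes the linked results file has been downloaded locally and has the
# same flat layout as the JSON block above.
with open("results_2024-01-23T22-26-42.328277.json") as f:
    report = json.load(f)

# Average acc_norm over every "harness|hendrycksTest-*" entry.
mmlu = [v["acc_norm"] for k, v in report.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```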
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AiMavenAi__MavenWest | [
"region:us"
] | 2024-01-23T22:29:04+00:00 | {"pretty_name": "Evaluation run of AiMavenAi/MavenWest", "dataset_summary": "Dataset automatically created during the evaluation run of model [AiMavenAi/MavenWest](https://huggingface.co/AiMavenAi/MavenWest) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AiMavenAi__MavenWest\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T22:26:42.328277](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__MavenWest/blob/main/results_2024-01-23T22-26-42.328277.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6519949800012913,\n \"acc_stderr\": 0.03226904531191905,\n \"acc_norm\": 0.651558948732219,\n \"acc_norm_stderr\": 0.032941911568037885,\n \"mc1\": 0.5055079559363526,\n \"mc1_stderr\": 0.01750243899045107,\n \"mc2\": 0.6529155692943942,\n \"mc2_stderr\": 0.015412828995723143\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.013449522109932489,\n \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653886\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7135032861979685,\n \"acc_stderr\": 0.004512002459757957,\n \"acc_norm\": 0.8843855805616411,\n \"acc_norm_stderr\": 0.003191084792793155\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461783,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461783\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.464804469273743,\n \"acc_stderr\": 0.01668102093107665,\n \"acc_norm\": 0.464804469273743,\n \"acc_norm_stderr\": 0.01668102093107665\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632952,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632952\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5055079559363526,\n \"mc1_stderr\": 0.01750243899045107,\n \"mc2\": 0.6529155692943942,\n \"mc2_stderr\": 0.015412828995723143\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \"acc_stderr\": 0.012757375376754941\n 
}\n}\n```", "repo_url": "https://huggingface.co/AiMavenAi/MavenWest", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|arc:challenge|25_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|gsm8k|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hellaswag|10_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-26-42.328277.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-26-42.328277.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-26-42.328277.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T22-26-42.328277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-26-42.328277.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T22_26_42.328277", "path": ["**/details_harness|winogrande|5_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T22-26-42.328277.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T22_26_42.328277", "path": ["results_2024-01-23T22-26-42.328277.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T22-26-42.328277.parquet"]}]}]} | 2024-01-23T22:29:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AiMavenAi/MavenWest
Dataset automatically created during the evaluation run of model AiMavenAi/MavenWest on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-23T22:26:42.328277 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AiMavenAi/MavenWest\n\n\n\nDataset automatically created during the evaluation run of model AiMavenAi/MavenWest on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T22:26:42.328277(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AiMavenAi/MavenWest\n\n\n\nDataset automatically created during the evaluation run of model AiMavenAi/MavenWest on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T22:26:42.328277(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
fbf8b94c674ea97e6adf9836e7f626f3ad3380ab |
# Description:
[paper](https://arxiv.org/abs/2012.12453) | [kaggle](https://www.kaggle.com/datasets/newslab/cholecseg8k)
The CholecSeg8k dataset, an extension of the Cholec80 collection, includes 8,080 carefully annotated images from laparoscopic cholecystectomy surgeries, selected from 17 video clips in Cholec80. Each image in CholecSeg8k is annotated at the pixel level for thirteen different surgical elements. The dataset is efficiently organized in a directory structure, featuring 101 folders, each containing 80 frames at a resolution of 854x480, along with three types of masks for each frame: a color mask for visualization, an annotation tool mask, and a watershed mask for simplified processing. This comprehensive dataset, freely available under the CC BY-NC-SA 4.0 license, is a critical resource for advancing the field of computer-assisted surgical procedures.
# Loading the data:
First install the `datasets` library, then run the following code:
```python
from datasets import load_dataset
dataset = load_dataset("minwoosun/CholecSeg8k", trust_remote_code=True)
```
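Once loaded, the standard `datasets` inspection calls can be used to sanity-check what you have. A minimal sketch (the split and column names match the ones used in the demo below):

```python
from datasets import load_dataset

dataset = load_dataset("minwoosun/CholecSeg8k", trust_remote_code=True)

print(dataset)                    # available splits and their row counts
print(dataset["train"].features)  # columns: image, color_mask, watershed_mask, annotation_mask
```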
# Simple demo:
This short demo shows how to load the data and directly visualize an image along with the corresponding masks.
```python
from datasets import load_dataset
import matplotlib.pyplot as plt
dataset = load_dataset("minwoosun/CholecSeg8k", trust_remote_code=True)
def display_image(dataset, image_index):
    '''Display the image and the corresponding three masks.'''
    fig, axs = plt.subplots(2, 2, figsize=(10, 10))

    # Hide the axes on every subplot
    for ax in axs.flat:
        ax.axis('off')

    # Display the image and each mask in its respective subplot
    axs[0, 0].imshow(dataset['train'][image_index]['image'])
    axs[0, 1].imshow(dataset['train'][image_index]['color_mask'])
    axs[1, 0].imshow(dataset['train'][image_index]['watershed_mask'])
    axs[1, 1].imshow(dataset['train'][image_index]['annotation_mask'])

    # Adjust spacing between subplots
    plt.subplots_adjust(wspace=0.01, hspace=-0.6)
    plt.show()

display_image(dataset, 800)  # image index from 0 to 8079
```

# Citation (BibTeX):
```
@misc{hong2020cholecseg8k,
title={CholecSeg8k: A Semantic Segmentation Dataset for Laparoscopic Cholecystectomy Based on Cholec80},
author={W. -Y. Hong and C. -L. Kao and Y. -H. Kuo and J. -R. Wang and W. -L. Chang and C. -S. Shih},
year={2020},
eprint={2012.12453},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
# Data card contact:
Min Woo Sun ([email protected])
| minwoosun/CholecSeg8k | [
"task_categories:image-segmentation",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-nc-sa-4.0",
"medical",
"biology",
"arxiv:2012.12453",
"region:us"
] | 2024-01-23T22:38:14+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["image-segmentation"], "pretty_name": "CholecSeg8k", "tags": ["medical", "biology"]} | 2024-01-25T19:30:13+00:00 | [
"2012.12453"
] | [
"en"
] | TAGS
#task_categories-image-segmentation #size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #medical #biology #arxiv-2012.12453 #region-us
|
# Description:
paper | kaggle
The CholecSeg8k dataset, an extension of the Cholec80 collection, includes 8,080 carefully annotated images from laparoscopic cholecystectomy surgeries, selected from 17 video clips in Cholec80. Each image in CholecSeg8k is annotated at the pixel level for thirteen different surgical elements. The dataset is efficiently organized in a directory structure, featuring 101 folders, each containing 80 frames at a resolution of 854x480, along with three types of masks for each frame: a color mask for visualization, an annotation tool mask, and a watershed mask for simplified processing. This comprehensive dataset, freely available under the CC BY-NC-SA 4.0 license, is a critical resource for advancing the field of computer-assisted surgical procedures.
# Loading the data:
First install the 'datasets' library, then run the following code:
# Simple demo:
This short demo shows how to load the data and directly visualize an image along with the corresponding masks.
!example image
(BibTeX):
# Data card contact:
Min Woo Sun (minwoos@URL)
| [
"# Description:\n\npaper | kaggle\n\nThe CholecSeg8k dataset, an extension of the Cholec80 collection, includes 8,080 carefully annotated images from laparoscopic cholecystectomy surgeries, selected from 17 video clips in Cholec80. Each image in CholecSeg8K is pixel-level annotated for thirteen different surgical elements. The dataset is efficiently organized in a directory structure, featuring 101 folders, each containing 80 frames at a resolution of 854x480, along with three types of masks for each frame: a color mask for visualization, an annotation tool mask, and a watershed mask for simplified processing. This comprehensive dataset, freely available under the CC BY-NC-SA 4.0 license, is a critical resource for advancing the field of computer-assisted surgical procedures.",
"# Loading the data:\n\nFirst install the 'datasets' library, then run the following code,",
"# Simple demo:\n\nThis short demo shows how to load the data and directly visualize an image along with the corresponding masks.\n\n\n!example image\n\n(BibTex):",
"# Data card contact:\nMin Woo Sun (minwoos@URL)"
] | [
"TAGS\n#task_categories-image-segmentation #size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #medical #biology #arxiv-2012.12453 #region-us \n",
"# Description:\n\npaper | kaggle\n\nThe CholecSeg8k dataset, an extension of the Cholec80 collection, includes 8,080 carefully annotated images from laparoscopic cholecystectomy surgeries, selected from 17 video clips in Cholec80. Each image in CholecSeg8K is pixel-level annotated for thirteen different surgical elements. The dataset is efficiently organized in a directory structure, featuring 101 folders, each containing 80 frames at a resolution of 854x480, along with three types of masks for each frame: a color mask for visualization, an annotation tool mask, and a watershed mask for simplified processing. This comprehensive dataset, freely available under the CC BY-NC-SA 4.0 license, is a critical resource for advancing the field of computer-assisted surgical procedures.",
"# Loading the data:\n\nFirst install the 'datasets' library, then run the following code,",
"# Simple demo:\n\nThis short demo shows how to load the data and directly visualize an image along with the corresponding masks.\n\n\n!example image\n\n(BibTex):",
"# Data card contact:\nMin Woo Sun (minwoos@URL)"
] |
c2abf7d280cbcd8a2c2f8b8285eeca90194e3669 | # lilac/TruthfulQA-Generation
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/truthful_qa](https://huggingface.co/datasets/truthful_qa)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-TruthfulQA-Generation
```
or from python with:
```py
ll.download("lilacai/lilac-TruthfulQA-Generation")
```
| lilacai/lilac-TruthfulQA-Generation | [
"Lilac",
"region:us"
] | 2024-01-23T22:44:21+00:00 | {"tags": ["Lilac"]} | 2024-01-23T22:54:29+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/TruthfulQA-Generation
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/TruthfulQA-Generation\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/TruthfulQA-Generation\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
a8df57ed10c8cbb64458243cd427696fbf8d167b |
# Dataset Card for Evaluation run of snorkelai/Snorkel-Mistral-PairRM-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [snorkelai/Snorkel-Mistral-PairRM-DPO](https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO",
"harness_winogrande_5",
split="train")
```
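Each configuration split loads as an ordinary `datasets.Dataset`, so the usual inspection calls apply. The per-example fields vary by harness task, so it is safer to print them than to assume names:

```python
print(len(data))          # number of evaluated examples in the split
print(data.column_names)  # per-example fields written by the evaluation harness
```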
## Latest results
These are the [latest results from run 2024-01-23T22:58:16.108311](https://huggingface.co/datasets/open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO/blob/main/results_2024-01-23T22-58-16.108311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.608528367712073,
"acc_stderr": 0.03310128621962333,
"acc_norm": 0.6136104634173836,
"acc_norm_stderr": 0.033774611688968344,
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.709120958732846,
"mc2_stderr": 0.015029348692083801
},
"harness|arc:challenge|25": {
"acc": 0.6160409556313993,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892973
},
"harness|hellaswag|10": {
"acc": 0.6795459071898028,
"acc_stderr": 0.004656974162147999,
"acc_norm": 0.8563035251941844,
"acc_norm_stderr": 0.0035006479678795772
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849723,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849723
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562417,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250948,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250948
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929189,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929189
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175371,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333555,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.01599564494729923,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.01599564494729923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119545,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935729,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935729
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954843,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954843
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.709120958732846,
"mc2_stderr": 0.015029348692083801
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774102
},
"harness|gsm8k|5": {
"acc": 0.3616376042456406,
"acc_stderr": 0.01323465835108877
}
}
```
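The same numbers are also stored in the `results` configuration and in the results JSON file linked above. If you want a single aggregate over the `hendrycksTest` (MMLU) subtasks, a minimal sketch over that file (an unweighted mean of the per-subtask accuracies, which may differ slightly from how the leaderboard itself aggregates):

```python
import json

# Assumption about the file layout: per-task scores sit under a top-level "results" key
with open("results_2024-01-23T22-58-16.108311.json") as f:
    results = json.load(f)["results"]

mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest")
]
print(f"{len(mmlu_scores)} MMLU subtasks, "
      f"unweighted mean acc = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```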
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO | [
"region:us"
] | 2024-01-23T22:52:02+00:00 | {"pretty_name": "Evaluation run of snorkelai/Snorkel-Mistral-PairRM-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [snorkelai/Snorkel-Mistral-PairRM-DPO](https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T22:58:16.108311](https://huggingface.co/datasets/open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO/blob/main/results_2024-01-23T22-58-16.108311.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.608528367712073,\n \"acc_stderr\": 0.03310128621962333,\n \"acc_norm\": 0.6136104634173836,\n \"acc_norm_stderr\": 0.033774611688968344,\n \"mc1\": 0.5495716034271726,\n \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.709120958732846,\n \"mc2_stderr\": 0.015029348692083801\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892973\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6795459071898028,\n \"acc_stderr\": 0.004656974162147999,\n \"acc_norm\": 0.8563035251941844,\n \"acc_norm_stderr\": 0.0035006479678795772\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849723,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849723\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562417,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 
0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.03252113489929189,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.03252113489929189\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.02336505149175371,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.02336505149175371\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n \"acc_stderr\": 
0.014836205167333555,\n \"acc_norm\": 0.7790549169859514,\n \"acc_norm_stderr\": 0.014836205167333555\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n \"acc_stderr\": 0.01599564494729923,\n \"acc_norm\": 0.3541899441340782,\n \"acc_norm_stderr\": 0.01599564494729923\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119545,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n \"acc_stderr\": 0.012669813464935729,\n \"acc_norm\": 0.43741851368970014,\n \"acc_norm_stderr\": 0.012669813464935729\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954843,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954843\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5495716034271726,\n \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.709120958732846,\n \"mc2_stderr\": 0.015029348692083801\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774102\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3616376042456406,\n \"acc_stderr\": 0.01323465835108877\n }\n}\n```", "repo_url": 
"https://huggingface.co/snorkelai/Snorkel-Mistral-PairRM-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|arc:challenge|25_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|arc:challenge|25_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|gsm8k|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|gsm8k|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hellaswag|10_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hellaswag|10_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-49-34.366391.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T22-49-34.366391.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-49-34.366391.parquet"]}, 
{"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["**/details_harness|winogrande|5_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": ["**/details_harness|winogrande|5_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T22-58-16.108311.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T22_49_34.366391", "path": ["results_2024-01-23T22-49-34.366391.parquet"]}, {"split": "2024_01_23T22_58_16.108311", "path": 
["results_2024-01-23T22-58-16.108311.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T22-58-16.108311.parquet"]}]}]} | 2024-01-23T23:00:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of snorkelai/Snorkel-Mistral-PairRM-DPO
Dataset automatically created during the evaluation run of model snorkelai/Snorkel-Mistral-PairRM-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
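For example (a minimal sketch: the repository name below is inferred from the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the 63 configurations listed in this card):

```python
from datasets import load_dataset

# Repo id inferred from the Open LLM Leaderboard naming convention;
# the config name selects one of the 63 per-task configurations.
data = load_dataset("open-llm-leaderboard/details_snorkelai__Snorkel-Mistral-PairRM-DPO",
                    "harness_winogrande_5",
                    split="train")
```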
## Latest results
These are the latest results from run 2024-01-23T22:58:16.108311 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of snorkelai/Snorkel-Mistral-PairRM-DPO\n\n\n\nDataset automatically created during the evaluation run of model snorkelai/Snorkel-Mistral-PairRM-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T22:58:16.108311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of snorkelai/Snorkel-Mistral-PairRM-DPO\n\n\n\nDataset automatically created during the evaluation run of model snorkelai/Snorkel-Mistral-PairRM-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T22:58:16.108311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e7c0581bb2451251923895d7d0e53f75b7fd9035 | # England NHS GP Reviews (2022 - 2024)
<!-- Provide a quick summary of the dataset. -->
England NHS GP Reviews (2022 - 2024) Scraped from https://www.nhs.uk/service-search/find-a-gp
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
England NHS GP Reviews (2022-2024) Scraped from https://www.nhs.uk/service-search/find-a-gp
This dataset contains reviews of GP surgeries across England scraped from the NHS website. Each GP surgery is identified by an ODS code and surgery name. The scraped data includes the first 7 pages of reviews for each surgery, capturing the following attributes:
- Surgery ODS Code
- Surgery name
- Scrape URL
- Review Title
- Star Rating (1-5)
- Review Comment Text
- Date of Review (Month and Year)
In total, the dataset covers GP surgery reviews posted between 2022 and 2024. Over 61,000 individual reviews have been gathered, providing insight into patient experiences with GP surgeries relating to aspects like appointments, staff, facilities and overall service.
The data is intended to enable further analysis into the quality of GP surgeries based on patient reviews submitted to the official NHS platforms. It can facilitate identification of review trends and top-performing GP surgeries based on review metrics like average ratings and the most frequent positive/negative review topics.
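As a minimal sketch of such an analysis (the column names `surgery_name` and `star_rating` are assumptions for illustration; check the published dataset's actual schema):

```python
from datasets import load_dataset

ds = load_dataset("janduplessis886/england-nhs-gp-reviews", split="train")
df = ds.to_pandas()

# Average star rating and review count per surgery (column names assumed)
avg_ratings = (df.groupby("surgery_name")["star_rating"]
                 .agg(["mean", "count"])
                 .sort_values("mean", ascending=False))
print(avg_ratings.head(10))
```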
The reviews have been gathered through web scraping the NHS public websites, but are not officially endorsed NHS data products. The provided reviews should be considered assertions by individual anonymous patients regarding their experience with the listed GP surgery. Personal information has been removed during the scraping process to protect patient privacy.
- **Curated by:** Jan du Plessis
- **Language(s) (NLP):** English
- **License:** UK open-government-license
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The dataset is intended for use with BERTopic for topic analysis.
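A minimal sketch of that intended use, assuming a local CSV export and a `review_text` column (both assumptions, not the dataset's confirmed schema); BERTopic's defaults are used apart from a larger minimum topic size.

```python
# Hedged sketch: feed the review comments to BERTopic for topic analysis.
# The CSV path and the 'review_text' column name are assumptions.
import pandas as pd
from bertopic import BERTopic

docs = (
    pd.read_csv("england_nhs_gp_reviews.csv")["review_text"]
      .dropna()
      .tolist()
)

topic_model = BERTopic(language="english", min_topic_size=50)
topics, probs = topic_model.fit_transform(docs)

print(topic_model.get_topic_info().head())  # inspect the discovered topics
```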
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
NLP, ML and deep learning projects.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
- Surgery ODS Code
- Surgery name
- Scrape URL
- Review Title
- Star Rating (1-5)
- Review Comment Text
- Date of Review (Month and Year)
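Following the loading convention used for other datasets on the Hub, the records can presumably be loaded with the `datasets` library; the `train` split name below is an assumption.

```python
from datasets import load_dataset

# The "train" split name is an assumption; adjust if the dataset uses another.
ds = load_dataset("janduplessis886/england-nhs-gp-reviews", split="train")
print(ds[0])  # one review record with the fields listed above
```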
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Training a model to classify medical reviews.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
https://www.nhs.uk/service-search/find-a-gp
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
Web scraping using BeautifulSoup.
Duplicates have been removed and NaN values dropped.
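A hedged sketch of that pipeline: fetch a page, parse it with BeautifulSoup, then deduplicate and drop missing rows with pandas. The URL is the real entry point given above, but the tag selection below is a placeholder rather than the selectors actually used in the scrape.

```python
# Sketch of the collection/cleaning steps described above; the tag used to
# locate review text is a placeholder, not the selector actually used.
import requests
import pandas as pd
from bs4 import BeautifulSoup

html = requests.get("https://www.nhs.uk/service-search/find-a-gp", timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Placeholder extraction: treat paragraph text as stand-in review records.
rows = [{"review_text": p.get_text(strip=True)} for p in soup.find_all("p")]

df = pd.DataFrame(rows)
df = df.drop_duplicates()  # remove duplicate reviews
df = df.dropna()           # drop rows with missing (NaN) values
```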
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
Reviews are anonymised, with no patient-identifiable information captured.
## Dataset Card Authors [optional]
Jan du Plessis
## Dataset Card Contact
[email protected] | janduplessis886/england-nhs-gp-reviews | [
"language:en",
"license:other",
"reviews",
"medical",
"gp",
"nhs",
"region:us"
] | 2024-01-23T22:52:45+00:00 | {"language": ["en"], "license": "other", "pretty_name": "England NHS GP Reviews (2022 - 2024)", "license_name": "open-government-license", "license_link": "https://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/", "tags": ["reviews", "medical", "gp", "nhs"]} | 2024-01-24T00:54:15+00:00 | [] | [
"en"
] | TAGS
#language-English #license-other #reviews #medical #gp #nhs #region-us
| # England NHS GP Reviews (2022 - 2024)
England NHS GP Reviews (2022 - 2024) Scraped from URL
## Dataset Details
### Dataset Description
England NHS GP Reviews (2022-2024) Scraped from URL
This dataset contains reviews of GP surgeries across England scraped from the NHS website. Each GP surgery is identified by an ODS code and surgery name. The scraped data includes the first 7 pages of reviews for each surgery, capturing the following attributes:
- Surgery ODS Code
- Surgery name
- Scrape URL
- Review Title
- Star Rating (1-5)
- Review Comment Text
- Date of Review (Month and Year)
In total, the dataset covers GP surgery reviews posted between 2022 and 2024. Over 61,000 individual reviews have been gathered, providing insight into patient experiences with GP surgeries relating to aspects like appointments, staff, facilities and overall service.

The data is intended to enable further analysis into the quality of GP surgeries based on patient reviews submitted to the official NHS platforms. It can facilitate identification of review trends and top-performing GP surgeries based on review metrics like average ratings and most frequent positive/negative review topics.
The reviews have been gathered through web scraping the NHS public websites, but are not officially endorsed NHS data products. The provided reviews should be considered assertions by individual anonymous patients regarding their experience with the listed GP surgery. Personal information has been removed during the scraping process to protect patient privacy.
- Curated by: Jan du Plessis
- Language(s) (NLP): English
- License: UK open-government-license
### Dataset Sources [optional]
## Uses
The dataset is intended for use with BERTopic for topic analysis.
### Direct Use
NLP, ML and deep learning projects.
## Dataset Structure
- Surgery ODS Code
- Surgery name
- Scrape URL
- Review Title
- Star Rating (1-5)
- Review Comment Text
- Date of Review (Month and Year)
## Dataset Creation
### Curation Rationale
Training a model to classify medical reviews.
### Source Data
URL
#### Data Collection and Processing
Web scraping using BeautifulSoup.
Duplicates have been removed and NaN values dropped.
#### Personal and Sensitive Information
Reviews are anonymised, with no patient-identifiable information captured.
## Dataset Card Authors [optional]
Jan du Plessis
## Dataset Card Contact
drjanduplessis@URL | [
"# England NHS GP Reviews (2022 - 2024)\n\n\n\nEngland NHS GP Reviews (2022 - 2024) Scrapped from URL",
"## Dataset Details",
"### Dataset Description\n\n\n\nEngland NHS GP Reviews (2022-2024) Scraped from URL\n\nThis dataset contains reviews of GP surgeries across England scraped from the NHS website. Each GP surgery is identified by an ODS code and surgery name. The scraped data includes the first 7 pages of reviews for each surgery, capturing the following attributes:\n\n- Surgery ODE Code\n- Surgery name\n- Scrape URL\n- Review Title\n- Star Rating (1-5)\n- Review Comment Text\n- Date of Review (Month and Year)\n\nIn total the dataset covers GP surgery reviews posted between 2022-2024. Over 61,000 individual reviews have been gathered providing insight into patient experiences with GP surgeries relating to aspects like appointments, staff, facilities and overall service.\n\nThe data is intended to enable further analysis into the quality of GP surgeries based on patient reviews submitted to the official NHS platforms. It can facilitate identification of review trends and top performing GP surgeries based on review metrics like average ratings and most frequent positive/negative review topics.\n\nThe reviews have been gathered through web scraping the NHS public websites, but are not officially endorsed NHS data products. The provided reviews should be considered assertions by individual anonymous patients regarding their experience with the listed GP surgery. Personal information has been removed during the scraping process to protect patient privacy.\n\n- Curated by: Jan du Plessis\n- Language(s) (NLP): English\n- License: UK open-government-license",
"### Dataset Sources [optional]",
"## Uses\n\n\nThe dataset is intended for use with BERTopic for topic analysis.",
"### Direct Use\n\n\nNLP ML + Deep Learining Projects",
"## Dataset Structure\n\n\n\n- Surgery ODE Code\n- Surgery name\n- Scrape URL\n- Review Title\n- Star Rating (1-5)\n- Review Comment Text\n- Date of Review (Month and Year)",
"## Dataset Creation",
"### Curation Rationale\n\n\nTraininig a model to classify medical reviews.",
"### Source Data\n\n\nURL",
"#### Data Collection and Processing\n\n\nWeb Scrapping using BeautifulSoup.\nDuplicates has been removed and NAN dropped.",
"#### Personal and Sensitive Information\n\n\nReviews are anonymise with not patient identifyable information captured.",
"## Dataset Card Authors [optional]\nJan du Plessis",
"## Dataset Card Contact\ndrjanduplessis@URL"
] | [
"TAGS\n#language-English #license-other #reviews #medical #gp #nhs #region-us \n",
"# England NHS GP Reviews (2022 - 2024)\n\n\n\nEngland NHS GP Reviews (2022 - 2024) Scrapped from URL",
"## Dataset Details",
"### Dataset Description\n\n\n\nEngland NHS GP Reviews (2022-2024) Scraped from URL\n\nThis dataset contains reviews of GP surgeries across England scraped from the NHS website. Each GP surgery is identified by an ODS code and surgery name. The scraped data includes the first 7 pages of reviews for each surgery, capturing the following attributes:\n\n- Surgery ODE Code\n- Surgery name\n- Scrape URL\n- Review Title\n- Star Rating (1-5)\n- Review Comment Text\n- Date of Review (Month and Year)\n\nIn total the dataset covers GP surgery reviews posted between 2022-2024. Over 61,000 individual reviews have been gathered providing insight into patient experiences with GP surgeries relating to aspects like appointments, staff, facilities and overall service.\n\nThe data is intended to enable further analysis into the quality of GP surgeries based on patient reviews submitted to the official NHS platforms. It can facilitate identification of review trends and top performing GP surgeries based on review metrics like average ratings and most frequent positive/negative review topics.\n\nThe reviews have been gathered through web scraping the NHS public websites, but are not officially endorsed NHS data products. The provided reviews should be considered assertions by individual anonymous patients regarding their experience with the listed GP surgery. Personal information has been removed during the scraping process to protect patient privacy.\n\n- Curated by: Jan du Plessis\n- Language(s) (NLP): English\n- License: UK open-government-license",
"### Dataset Sources [optional]",
"## Uses\n\n\nThe dataset is intended for use with BERTopic for topic analysis.",
"### Direct Use\n\n\nNLP ML + Deep Learining Projects",
"## Dataset Structure\n\n\n\n- Surgery ODE Code\n- Surgery name\n- Scrape URL\n- Review Title\n- Star Rating (1-5)\n- Review Comment Text\n- Date of Review (Month and Year)",
"## Dataset Creation",
"### Curation Rationale\n\n\nTraininig a model to classify medical reviews.",
"### Source Data\n\n\nURL",
"#### Data Collection and Processing\n\n\nWeb Scrapping using BeautifulSoup.\nDuplicates has been removed and NAN dropped.",
"#### Personal and Sensitive Information\n\n\nReviews are anonymise with not patient identifyable information captured.",
"## Dataset Card Authors [optional]\nJan du Plessis",
"## Dataset Card Contact\ndrjanduplessis@URL"
] |
f25c2c38e5f794d56cffe8fd9c586c010eeb38d8 | # vogue-runway-top15-512px-nobg-embeddings
[Vogue Runway](https://www.vogue.com/fashion-shows)
- 15 fashion houses
- 1679 collections
- 87,547 images
Fashion Houses: Alexander McQueen, Armani, Balenciaga, Calvin Klein, Chanel, Dior, Fendi, Gucci, Hermes, Louis Vuitton, Prada, Ralph Lauren, Saint Laurent, Valentino, Versace.
Images have a maximum height of 512 pixels.
Background is removed using [mattmdjaga/segformer_b2_clothes](https://huggingface.co/mattmdjaga/segformer_b2_clothes).
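A sketch of what that background-removal step might look like with the segmentation model above; the assumption that label 0 is the background class follows the model card and should be verified, and the image file names are placeholders.

```python
# Hedged sketch of background removal with the clothes-segmentation model
# referenced above. Label 0 is assumed to be "Background" per the model card.
import numpy as np
import torch
from PIL import Image
from transformers import AutoModelForSemanticSegmentation, SegformerImageProcessor

processor = SegformerImageProcessor.from_pretrained("mattmdjaga/segformer_b2_clothes")
model = AutoModelForSemanticSegmentation.from_pretrained("mattmdjaga/segformer_b2_clothes")

image = Image.open("look.jpg").convert("RGB")  # placeholder runway image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # (H, W) label map

arr = np.array(image)
arr[(mask == 0).numpy()] = 255  # paint assumed-background pixels white
Image.fromarray(arr).save("look_nobg.png")
```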
Embeddings generated with [tonyassi/vogue-fashion-collection-15-nobg](https://huggingface.co/tonyassi/vogue-fashion-collection-15-nobg). | tonyassi/vogue-runway-top15-512px-nobg-embeddings | [
"region:us"
] | 2024-01-23T22:56:07+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "alexander mcqueen,fall 1996 ready to wear", "1": "alexander mcqueen,fall 1997 ready to wear", "2": "alexander mcqueen,fall 1998 ready to wear", "3": "alexander mcqueen,fall 1999 ready to wear", "4": "alexander mcqueen,fall 2000 ready to wear", "5": "alexander mcqueen,fall 2001 ready to wear", "6": "alexander mcqueen,fall 2002 ready to wear", "7": "alexander mcqueen,fall 2003 ready to wear", "8": "alexander mcqueen,fall 2004 ready to wear", "9": "alexander mcqueen,fall 2005 menswear", "10": "alexander mcqueen,fall 2005 ready to wear", "11": "alexander mcqueen,fall 2006 menswear", "12": "alexander mcqueen,fall 2006 ready to wear", "13": "alexander mcqueen,fall 2007 menswear", "14": "alexander mcqueen,fall 2007 ready to wear", "15": "alexander mcqueen,fall 2008 menswear", "16": "alexander mcqueen,fall 2008 ready to wear", "17": "alexander mcqueen,fall 2009 ready to wear", "18": "alexander mcqueen,fall 2010 menswear", "19": "alexander mcqueen,fall 2010 ready to wear", "20": "alexander mcqueen,fall 2011 menswear", "21": "alexander mcqueen,fall 2011 ready to wear", "22": "alexander mcqueen,fall 2012 menswear", "23": "alexander mcqueen,fall 2012 ready to wear", "24": "alexander mcqueen,fall 2013 menswear", "25": "alexander mcqueen,fall 2013 ready to wear", "26": "alexander mcqueen,fall 2014 menswear", "27": "alexander mcqueen,fall 2014 ready to wear", "28": "alexander mcqueen,fall 2015 menswear", "29": "alexander mcqueen,fall 2015 ready to wear", "30": "alexander mcqueen,fall 2016 menswear", "31": "alexander mcqueen,fall 2016 ready to wear", "32": "alexander mcqueen,fall 2017 menswear", "33": "alexander mcqueen,fall 2017 ready to wear", "34": "alexander mcqueen,fall 2018 menswear", "35": "alexander mcqueen,fall 2018 ready to wear", "36": "alexander mcqueen,fall 2019 menswear", "37": "alexander mcqueen,fall 2019 ready to wear", "38": "alexander mcqueen,fall 2020 menswear", "39": "alexander mcqueen,fall 2020 ready to wear", "40": "alexander mcqueen,fall 2021 menswear", "41": "alexander mcqueen,fall 2021 ready to wear", "42": "alexander mcqueen,fall 2022 menswear", "43": "alexander mcqueen,fall 2022 ready to wear", "44": "alexander mcqueen,fall 2023 menswear", "45": "alexander mcqueen,fall 2023 ready to wear", "46": "alexander mcqueen,pre fall 2009", "47": "alexander mcqueen,pre fall 2011", "48": "alexander mcqueen,pre fall 2012", "49": "alexander mcqueen,pre fall 2013", "50": "alexander mcqueen,pre fall 2014", "51": "alexander mcqueen,pre fall 2015", "52": "alexander mcqueen,pre fall 2016", "53": "alexander mcqueen,pre fall 2017", "54": "alexander mcqueen,pre fall 2018", "55": "alexander mcqueen,pre fall 2019", "56": "alexander mcqueen,pre fall 2020", "57": "alexander mcqueen,pre fall 2021", "58": "alexander mcqueen,pre fall 2021 menswear", "59": "alexander mcqueen,pre fall 2022", "60": "alexander mcqueen,pre fall 2023", "61": "alexander mcqueen,resort 2009", "62": "alexander mcqueen,resort 2010", "63": "alexander mcqueen,resort 2011", "64": "alexander mcqueen,resort 2012", "65": "alexander mcqueen,resort 2013", "66": "alexander mcqueen,resort 2014", "67": "alexander mcqueen,resort 2015", "68": "alexander mcqueen,resort 2016", "69": "alexander mcqueen,resort 2017", "70": "alexander mcqueen,resort 2018", "71": "alexander mcqueen,resort 2019", "72": "alexander mcqueen,resort 2020", "73": "alexander mcqueen,resort 2021", "74": "alexander 
mcqueen,resort 2022", "75": "alexander mcqueen,resort 2023", "76": "alexander mcqueen,spring 1995 ready to wear", "77": "alexander mcqueen,spring 1996 ready to wear", "78": "alexander mcqueen,spring 1997 ready to wear", "79": "alexander mcqueen,spring 1998 ready to wear", "80": "alexander mcqueen,spring 1999 ready to wear", "81": "alexander mcqueen,spring 2000 ready to wear", "82": "alexander mcqueen,spring 2001 ready to wear", "83": "alexander mcqueen,spring 2002 ready to wear", "84": "alexander mcqueen,spring 2003 ready to wear", "85": "alexander mcqueen,spring 2004 ready to wear", "86": "alexander mcqueen,spring 2005 menswear", "87": "alexander mcqueen,spring 2005 ready to wear", "88": "alexander mcqueen,spring 2006 menswear", "89": "alexander mcqueen,spring 2006 ready to wear", "90": "alexander mcqueen,spring 2007 menswear", "91": "alexander mcqueen,spring 2007 ready to wear", "92": "alexander mcqueen,spring 2008 menswear", "93": "alexander mcqueen,spring 2008 ready to wear", "94": "alexander mcqueen,spring 2009 menswear", "95": "alexander mcqueen,spring 2009 ready to wear", "96": "alexander mcqueen,spring 2010 menswear", "97": "alexander mcqueen,spring 2010 ready to wear", "98": "alexander mcqueen,spring 2011 menswear", "99": "alexander mcqueen,spring 2011 ready to wear", "100": "alexander mcqueen,spring 2012 menswear", "101": "alexander mcqueen,spring 2012 ready to wear", "102": "alexander mcqueen,spring 2013 menswear", "103": "alexander mcqueen,spring 2013 ready to wear", "104": "alexander mcqueen,spring 2014 menswear", "105": "alexander mcqueen,spring 2014 ready to wear", "106": "alexander mcqueen,spring 2015 menswear", "107": "alexander mcqueen,spring 2015 ready to wear", "108": "alexander mcqueen,spring 2016 menswear", "109": "alexander mcqueen,spring 2016 ready to wear", "110": "alexander mcqueen,spring 2017 menswear", "111": "alexander mcqueen,spring 2017 ready to wear", "112": "alexander mcqueen,spring 2018 menswear", "113": "alexander mcqueen,spring 2018 ready to wear", "114": "alexander mcqueen,spring 2019 menswear", "115": "alexander mcqueen,spring 2019 ready to wear", "116": "alexander mcqueen,spring 2020 menswear", "117": "alexander mcqueen,spring 2020 ready to wear", "118": "alexander mcqueen,spring 2021 menswear", "119": "alexander mcqueen,spring 2021 ready to wear", "120": "alexander mcqueen,spring 2022 menswear", "121": "alexander mcqueen,spring 2022 ready to wear", "122": "alexander mcqueen,spring 2023 menswear", "123": "alexander mcqueen,spring 2023 ready to wear", "124": "alexander mcqueen,spring 2024 menswear", "125": "alexander mcqueen,spring 2024 ready to wear", "126": "armani prive,fall 2005 couture", "127": "armani prive,fall 2006 couture", "128": "armani prive,fall 2007 couture", "129": "armani prive,fall 2008 couture", "130": "armani prive,fall 2009 couture", "131": "armani prive,fall 2010 couture", "132": "armani prive,fall 2011 couture", "133": "armani prive,fall 2012 couture", "134": "armani prive,fall 2013 couture", "135": "armani prive,fall 2014 couture", "136": "armani prive,fall 2015 couture", "137": "armani prive,fall 2016 couture", "138": "armani prive,fall 2017 couture", "139": "armani prive,fall 2018 couture", "140": "armani prive,fall 2019 couture", "141": "armani prive,fall 2021 couture", "142": "armani prive,fall 2022 couture", "143": "armani prive,fall 2023 couture", "144": "armani prive,spring 2005 couture", "145": "armani prive,spring 2006 couture", "146": "armani prive,spring 2007 couture", "147": "armani prive,spring 2008 couture", "148": 
"armani prive,spring 2009 couture", "149": "armani prive,spring 2010 couture", "150": "armani prive,spring 2011 couture", "151": "armani prive,spring 2012 couture", "152": "armani prive,spring 2013 couture", "153": "armani prive,spring 2014 couture", "154": "armani prive,spring 2015 couture", "155": "armani prive,spring 2016 couture", "156": "armani prive,spring 2017 couture", "157": "armani prive,spring 2018 couture", "158": "armani prive,spring 2019 couture", "159": "armani prive,spring 2020 couture", "160": "armani prive,spring 2021 couture", "161": "armani prive,spring 2023 couture", "162": "balenciaga,fall 2000 ready to wear", "163": "balenciaga,fall 2001 ready to wear", "164": "balenciaga,fall 2002 ready to wear", "165": "balenciaga,fall 2003 ready to wear", "166": "balenciaga,fall 2004 ready to wear", "167": "balenciaga,fall 2005 ready to wear", "168": "balenciaga,fall 2006 ready to wear", "169": "balenciaga,fall 2007 menswear", "170": "balenciaga,fall 2007 ready to wear", "171": "balenciaga,fall 2008 ready to wear", "172": "balenciaga,fall 2009 ready to wear", "173": "balenciaga,fall 2010 ready to wear", "174": "balenciaga,fall 2011 menswear", "175": "balenciaga,fall 2011 ready to wear", "176": "balenciaga,fall 2012 menswear", "177": "balenciaga,fall 2012 ready to wear", "178": "balenciaga,fall 2013 menswear", "179": "balenciaga,fall 2013 ready to wear", "180": "balenciaga,fall 2014 menswear", "181": "balenciaga,fall 2014 ready to wear", "182": "balenciaga,fall 2015 menswear", "183": "balenciaga,fall 2015 ready to wear", "184": "balenciaga,fall 2016 ready to wear", "185": "balenciaga,fall 2017 menswear", "186": "balenciaga,fall 2017 ready to wear", "187": "balenciaga,fall 2018 ready to wear", "188": "balenciaga,fall 2019 menswear", "189": "balenciaga,fall 2019 ready to wear", "190": "balenciaga,fall 2020 menswear", "191": "balenciaga,fall 2020 ready to wear", "192": "balenciaga,fall 2021 couture", "193": "balenciaga,fall 2021 menswear", "194": "balenciaga,fall 2021 ready to wear", "195": "balenciaga,fall 2022 couture", "196": "balenciaga,fall 2022 ready to wear", "197": "balenciaga,fall 2023 couture", "198": "balenciaga,fall 2023 ready to wear", "199": "balenciaga,pre fall 2008", "200": "balenciaga,pre fall 2009", "201": "balenciaga,pre fall 2010", "202": "balenciaga,pre fall 2011", "203": "balenciaga,pre fall 2012", "204": "balenciaga,pre fall 2013", "205": "balenciaga,pre fall 2014", "206": "balenciaga,pre fall 2015", "207": "balenciaga,pre fall 2016", "208": "balenciaga,pre fall 2017", "209": "balenciaga,pre fall 2018", "210": "balenciaga,pre fall 2019", "211": "balenciaga,pre fall 2020", "212": "balenciaga,pre fall 2021", "213": "balenciaga,pre fall 2022", "214": "balenciaga,pre fall 2023", "215": "balenciaga,pre fall 2024", "216": "balenciaga,resort 2008", "217": "balenciaga,resort 2009", "218": "balenciaga,resort 2010", "219": "balenciaga,resort 2011", "220": "balenciaga,resort 2012", "221": "balenciaga,resort 2013", "222": "balenciaga,resort 2014", "223": "balenciaga,resort 2015", "224": "balenciaga,resort 2016", "225": "balenciaga,resort 2017", "226": "balenciaga,resort 2018", "227": "balenciaga,resort 2019", "228": "balenciaga,resort 2020", "229": "balenciaga,resort 2021", "230": "balenciaga,resort 2022", "231": "balenciaga,resort 2023", "232": "balenciaga,resort 2024", "233": "balenciaga,spring 1998 ready to wear", "234": "balenciaga,spring 2000 ready to wear", "235": "balenciaga,spring 2001 ready to wear", "236": "balenciaga,spring 2002 ready to wear", "237": 
"balenciaga,spring 2003 ready to wear", "238": "balenciaga,spring 2004 ready to wear", "239": "balenciaga,spring 2005 ready to wear", "240": "balenciaga,spring 2006 ready to wear", "241": "balenciaga,spring 2007 menswear", "242": "balenciaga,spring 2007 ready to wear", "243": "balenciaga,spring 2008 menswear", "244": "balenciaga,spring 2008 ready to wear", "245": "balenciaga,spring 2009 ready to wear", "246": "balenciaga,spring 2010 ready to wear", "247": "balenciaga,spring 2011 menswear", "248": "balenciaga,spring 2011 ready to wear", "249": "balenciaga,spring 2012 menswear", "250": "balenciaga,spring 2012 ready to wear", "251": "balenciaga,spring 2013 menswear", "252": "balenciaga,spring 2013 ready to wear", "253": "balenciaga,spring 2014 menswear", "254": "balenciaga,spring 2014 ready to wear", "255": "balenciaga,spring 2015 menswear", "256": "balenciaga,spring 2015 ready to wear", "257": "balenciaga,spring 2016 menswear", "258": "balenciaga,spring 2016 ready to wear", "259": "balenciaga,spring 2017 menswear", "260": "balenciaga,spring 2017 ready to wear", "261": "balenciaga,spring 2018 menswear", "262": "balenciaga,spring 2018 ready to wear", "263": "balenciaga,spring 2019 ready to wear", "264": "balenciaga,spring 2020 menswear", "265": "balenciaga,spring 2020 ready to wear", "266": "balenciaga,spring 2021 menswear", "267": "balenciaga,spring 2021 ready to wear", "268": "balenciaga,spring 2022 ready to wear", "269": "balenciaga,spring 2023 ready to wear", "270": "balenciaga,spring 2024 ready to wear", "271": "calvin klein collection,fall 1995 ready to wear", "272": "calvin klein collection,fall 1996 ready to wear", "273": "calvin klein collection,fall 1997 ready to wear", "274": "calvin klein collection,fall 1998 ready to wear", "275": "calvin klein collection,fall 1999 ready to wear", "276": "calvin klein collection,fall 2000 ready to wear", "277": "calvin klein collection,fall 2001 ready to wear", "278": "calvin klein collection,fall 2002 ready to wear", "279": "calvin klein collection,fall 2003 ready to wear", "280": "calvin klein collection,fall 2004 ready to wear", "281": "calvin klein collection,fall 2005 menswear", "282": "calvin klein collection,fall 2005 ready to wear", "283": "calvin klein collection,fall 2006 menswear", "284": "calvin klein collection,fall 2006 ready to wear", "285": "calvin klein collection,fall 2007 menswear", "286": "calvin klein collection,fall 2007 ready to wear", "287": "calvin klein collection,fall 2008 menswear", "288": "calvin klein collection,fall 2008 ready to wear", "289": "calvin klein collection,fall 2009 ready to wear", "290": "calvin klein collection,fall 2010 menswear", "291": "calvin klein collection,fall 2010 ready to wear", "292": "calvin klein collection,fall 2011 menswear", "293": "calvin klein collection,fall 2011 ready to wear", "294": "calvin klein collection,fall 2012 menswear", "295": "calvin klein collection,fall 2012 ready to wear", "296": "calvin klein collection,fall 2013 menswear", "297": "calvin klein collection,fall 2013 ready to wear", "298": "calvin klein collection,fall 2014 menswear", "299": "calvin klein collection,fall 2014 ready to wear", "300": "calvin klein collection,fall 2015 menswear", "301": "calvin klein collection,fall 2015 ready to wear", "302": "calvin klein collection,fall 2016 menswear", "303": "calvin klein collection,fall 2016 ready to wear", "304": "calvin klein collection,pre fall 2008", "305": "calvin klein collection,pre fall 2009", "306": "calvin klein collection,pre fall 2010", "307": "calvin klein 
collection,pre fall 2011", "308": "calvin klein collection,pre fall 2012", "309": "calvin klein collection,pre fall 2013", "310": "calvin klein collection,pre fall 2014", "311": "calvin klein collection,pre fall 2015", "312": "calvin klein collection,pre fall 2016", "313": "calvin klein collection,resort 2008", "314": "calvin klein collection,resort 2009", "315": "calvin klein collection,resort 2010", "316": "calvin klein collection,resort 2011", "317": "calvin klein collection,resort 2012", "318": "calvin klein collection,resort 2013", "319": "calvin klein collection,resort 2014", "320": "calvin klein collection,resort 2015", "321": "calvin klein collection,resort 2016", "322": "calvin klein collection,resort 2017", "323": "calvin klein collection,spring 1994 ready to wear", "324": "calvin klein collection,spring 1995 ready to wear", "325": "calvin klein collection,spring 1996 ready to wear", "326": "calvin klein collection,spring 1997 ready to wear", "327": "calvin klein collection,spring 1998 ready to wear", "328": "calvin klein collection,spring 1999 ready to wear", "329": "calvin klein collection,spring 2000 ready to wear", "330": "calvin klein collection,spring 2001 ready to wear", "331": "calvin klein collection,spring 2002 ready to wear", "332": "calvin klein collection,spring 2003 ready to wear", "333": "calvin klein collection,spring 2004 ready to wear", "334": "calvin klein collection,spring 2005 menswear", "335": "calvin klein collection,spring 2005 ready to wear", "336": "calvin klein collection,spring 2006 menswear", "337": "calvin klein collection,spring 2006 ready to wear", "338": "calvin klein collection,spring 2007 menswear", "339": "calvin klein collection,spring 2007 ready to wear", "340": "calvin klein collection,spring 2008 menswear", "341": "calvin klein collection,spring 2008 ready to wear", "342": "calvin klein collection,spring 2009 menswear", "343": "calvin klein collection,spring 2009 ready to wear", "344": "calvin klein collection,spring 2010 menswear", "345": "calvin klein collection,spring 2010 ready to wear", "346": "calvin klein collection,spring 2011 menswear", "347": "calvin klein collection,spring 2011 ready to wear", "348": "calvin klein collection,spring 2012 menswear", "349": "calvin klein collection,spring 2012 ready to wear", "350": "calvin klein collection,spring 2013 menswear", "351": "calvin klein collection,spring 2013 ready to wear", "352": "calvin klein collection,spring 2014 menswear", "353": "calvin klein collection,spring 2014 ready to wear", "354": "calvin klein collection,spring 2015 menswear", "355": "calvin klein collection,spring 2015 ready to wear", "356": "calvin klein collection,spring 2016 menswear", "357": "calvin klein collection,spring 2016 ready to wear", "358": "calvin klein collection,spring 2017 menswear", "359": "calvin klein,fall 2017 menswear", "360": "calvin klein,fall 2017 ready to wear", "361": "calvin klein,fall 2018 menswear", "362": "calvin klein,fall 2018 ready to wear", "363": "calvin klein,pre fall 2019", "364": "calvin klein,resort 2019", "365": "calvin klein,spring 2018 menswear", "366": "calvin klein,spring 2018 ready to wear", "367": "calvin klein,spring 2019 menswear", "368": "calvin klein,spring 2019 ready to wear", "369": "chanel,fall 1991 ready to wear", "370": "chanel,fall 1994 ready to wear", "371": "chanel,fall 1995 couture", "372": "chanel,fall 1996 couture", "373": "chanel,fall 1997 couture", "374": "chanel,fall 1999 couture", "375": "chanel,fall 2000 couture", "376": "chanel,fall 2000 ready to 
wear", "377": "chanel,fall 2002 couture", "378": "chanel,fall 2003 ready to wear", "379": "chanel,fall 2004 couture", "380": "chanel,fall 2004 ready to wear", "381": "chanel,fall 2005 couture", "382": "chanel,fall 2005 ready to wear", "383": "chanel,fall 2006 couture", "384": "chanel,fall 2006 ready to wear", "385": "chanel,fall 2007 couture", "386": "chanel,fall 2007 ready to wear", "387": "chanel,fall 2008 couture", "388": "chanel,fall 2008 ready to wear", "389": "chanel,fall 2009 couture", "390": "chanel,fall 2009 ready to wear", "391": "chanel,fall 2010 couture", "392": "chanel,fall 2010 ready to wear", "393": "chanel,fall 2011 couture", "394": "chanel,fall 2011 ready to wear", "395": "chanel,fall 2012 couture", "396": "chanel,fall 2012 ready to wear", "397": "chanel,fall 2013 couture", "398": "chanel,fall 2013 ready to wear", "399": "chanel,fall 2014 couture", "400": "chanel,fall 2014 ready to wear", "401": "chanel,fall 2015 couture", "402": "chanel,fall 2015 ready to wear", "403": "chanel,fall 2016 couture", "404": "chanel,fall 2016 ready to wear", "405": "chanel,fall 2017 couture", "406": "chanel,fall 2017 ready to wear", "407": "chanel,fall 2018 couture", "408": "chanel,fall 2018 ready to wear", "409": "chanel,fall 2019 couture", "410": "chanel,fall 2019 ready to wear", "411": "chanel,fall 2020 couture", "412": "chanel,fall 2020 ready to wear", "413": "chanel,fall 2021 couture", "414": "chanel,fall 2021 ready to wear", "415": "chanel,fall 2022 couture", "416": "chanel,fall 2022 ready to wear", "417": "chanel,fall 2023 couture", "418": "chanel,fall 2023 ready to wear", "419": "chanel,pre fall 2008", "420": "chanel,pre fall 2009", "421": "chanel,pre fall 2010", "422": "chanel,pre fall 2011", "423": "chanel,pre fall 2012", "424": "chanel,pre fall 2013", "425": "chanel,pre fall 2014", "426": "chanel,pre fall 2015", "427": "chanel,pre fall 2016", "428": "chanel,pre fall 2017", "429": "chanel,pre fall 2018", "430": "chanel,pre fall 2019", "431": "chanel,pre fall 2020", "432": "chanel,pre fall 2021", "433": "chanel,pre fall 2022", "434": "chanel,pre fall 2023", "435": "chanel,pre fall 2024", "436": "chanel,resort 2007", "437": "chanel,resort 2008", "438": "chanel,resort 2009", "439": "chanel,resort 2010", "440": "chanel,resort 2011", "441": "chanel,resort 2012", "442": "chanel,resort 2013", "443": "chanel,resort 2014", "444": "chanel,resort 2015", "445": "chanel,resort 2016", "446": "chanel,resort 2017", "447": "chanel,resort 2018", "448": "chanel,resort 2019", "449": "chanel,resort 2020", "450": "chanel,resort 2021", "451": "chanel,resort 2022", "452": "chanel,resort 2023", "453": "chanel,resort 2024", "454": "chanel,spring 1992 ready to wear", "455": "chanel,spring 1993 couture", "456": "chanel,spring 1993 ready to wear", "457": "chanel,spring 1994 ready to wear", "458": "chanel,spring 1995 ready to wear", "459": "chanel,spring 1996 ready to wear", "460": "chanel,spring 1997 couture", "461": "chanel,spring 1999 couture", "462": "chanel,spring 2001 couture", "463": "chanel,spring 2002 couture", "464": "chanel,spring 2002 ready to wear", "465": "chanel,spring 2003 couture", "466": "chanel,spring 2004 couture", "467": "chanel,spring 2004 ready to wear", "468": "chanel,spring 2005 couture", "469": "chanel,spring 2005 ready to wear", "470": "chanel,spring 2006 couture", "471": "chanel,spring 2006 ready to wear", "472": "chanel,spring 2007 couture", "473": "chanel,spring 2007 ready to wear", "474": "chanel,spring 2008 couture", "475": "chanel,spring 2008 ready to wear", "476": "chanel,spring 
2009 couture", "477": "chanel,spring 2009 ready to wear", "478": "chanel,spring 2010 couture", "479": "chanel,spring 2010 ready to wear", "480": "chanel,spring 2011 couture", "481": "chanel,spring 2011 ready to wear", "482": "chanel,spring 2012 couture", "483": "chanel,spring 2012 ready to wear", "484": "chanel,spring 2013 couture", "485": "chanel,spring 2013 ready to wear", "486": "chanel,spring 2014 couture", "487": "chanel,spring 2014 ready to wear", "488": "chanel,spring 2015 couture", "489": "chanel,spring 2015 ready to wear", "490": "chanel,spring 2016 couture", "491": "chanel,spring 2016 ready to wear", "492": "chanel,spring 2017 couture", "493": "chanel,spring 2017 ready to wear", "494": "chanel,spring 2018 couture", "495": "chanel,spring 2018 ready to wear", "496": "chanel,spring 2019 couture", "497": "chanel,spring 2019 ready to wear", "498": "chanel,spring 2020 couture", "499": "chanel,spring 2020 ready to wear", "500": "chanel,spring 2021 couture", "501": "chanel,spring 2021 ready to wear", "502": "chanel,spring 2022 couture", "503": "chanel,spring 2022 ready to wear", "504": "chanel,spring 2023 couture", "505": "chanel,spring 2023 ready to wear", "506": "chanel,spring 2024 ready to wear", "507": "christian dior,fall 1999 couture", "508": "christian dior,fall 2000 couture", "509": "christian dior,fall 2000 ready to wear", "510": "christian dior,fall 2001 couture", "511": "christian dior,fall 2001 ready to wear", "512": "christian dior,fall 2002 couture", "513": "christian dior,fall 2002 ready to wear", "514": "christian dior,fall 2003 couture", "515": "christian dior,fall 2003 ready to wear", "516": "christian dior,fall 2004 couture", "517": "christian dior,fall 2004 ready to wear", "518": "christian dior,fall 2005 couture", "519": "christian dior,fall 2005 ready to wear", "520": "christian dior,fall 2006 couture", "521": "christian dior,fall 2006 ready to wear", "522": "christian dior,fall 2007 couture", "523": "christian dior,fall 2007 ready to wear", "524": "christian dior,fall 2008 couture", "525": "christian dior,fall 2008 ready to wear", "526": "christian dior,fall 2009 couture", "527": "christian dior,fall 2009 ready to wear", "528": "christian dior,fall 2010 couture", "529": "christian dior,fall 2010 menswear", "530": "christian dior,fall 2010 ready to wear", "531": "christian dior,fall 2011 couture", "532": "christian dior,fall 2011 ready to wear", "533": "christian dior,fall 2012 couture", "534": "christian dior,fall 2012 ready to wear", "535": "christian dior,fall 2013 couture", "536": "christian dior,fall 2013 ready to wear", "537": "christian dior,fall 2014 couture", "538": "christian dior,fall 2014 ready to wear", "539": "christian dior,fall 2015 couture", "540": "christian dior,fall 2015 ready to wear", "541": "christian dior,fall 2016 couture", "542": "christian dior,fall 2016 ready to wear", "543": "christian dior,fall 2017 couture", "544": "christian dior,fall 2017 ready to wear", "545": "christian dior,fall 2018 couture", "546": "christian dior,fall 2018 ready to wear", "547": "christian dior,fall 2019 couture", "548": "christian dior,fall 2019 ready to wear", "549": "christian dior,fall 2020 couture", "550": "christian dior,fall 2021 couture", "551": "christian dior,fall 2021 ready to wear", "552": "christian dior,fall 2022 couture", "553": "christian dior,fall 2022 ready to wear", "554": "christian dior,fall 2023 couture", "555": "christian dior,fall 2023 ready to wear", "556": "christian dior,pre fall 2009", "557": "christian dior,pre fall 2010", "558": 
"christian dior,pre fall 2011", "559": "christian dior,pre fall 2012", "560": "christian dior,pre fall 2013", "561": "christian dior,pre fall 2014", "562": "christian dior,pre fall 2015", "563": "christian dior,pre fall 2016", "564": "christian dior,pre fall 2017", "565": "christian dior,pre fall 2018", "566": "christian dior,pre fall 2019", "567": "christian dior,pre fall 2020", "568": "christian dior,pre fall 2021", "569": "christian dior,pre fall 2022", "570": "christian dior,pre fall 2023", "571": "christian dior,resort 2007", "572": "christian dior,resort 2008", "573": "christian dior,resort 2009", "574": "christian dior,resort 2010", "575": "christian dior,resort 2011", "576": "christian dior,resort 2012", "577": "christian dior,resort 2013", "578": "christian dior,resort 2014", "579": "christian dior,resort 2015", "580": "christian dior,resort 2016", "581": "christian dior,resort 2017", "582": "christian dior,resort 2018", "583": "christian dior,resort 2019", "584": "christian dior,resort 2020", "585": "christian dior,resort 2021", "586": "christian dior,resort 2022", "587": "christian dior,resort 2023", "588": "christian dior,resort 2024", "589": "christian dior,spring 1999 couture", "590": "christian dior,spring 2000 ready to wear", "591": "christian dior,spring 2001 couture", "592": "christian dior,spring 2001 ready to wear", "593": "christian dior,spring 2002 couture", "594": "christian dior,spring 2002 ready to wear", "595": "christian dior,spring 2003 couture", "596": "christian dior,spring 2003 ready to wear", "597": "christian dior,spring 2004 couture", "598": "christian dior,spring 2004 ready to wear", "599": "christian dior,spring 2005 couture", "600": "christian dior,spring 2005 ready to wear", "601": "christian dior,spring 2006 couture", "602": "christian dior,spring 2006 ready to wear", "603": "christian dior,spring 2007 couture", "604": "christian dior,spring 2007 ready to wear", "605": "christian dior,spring 2008 couture", "606": "christian dior,spring 2008 ready to wear", "607": "christian dior,spring 2009 couture", "608": "christian dior,spring 2009 ready to wear", "609": "christian dior,spring 2010 couture", "610": "christian dior,spring 2010 menswear", "611": "christian dior,spring 2010 ready to wear", "612": "christian dior,spring 2011 couture", "613": "christian dior,spring 2011 ready to wear", "614": "christian dior,spring 2012 couture", "615": "christian dior,spring 2012 ready to wear", "616": "christian dior,spring 2013 couture", "617": "christian dior,spring 2013 ready to wear", "618": "christian dior,spring 2014 couture", "619": "christian dior,spring 2014 ready to wear", "620": "christian dior,spring 2015 couture", "621": "christian dior,spring 2015 ready to wear", "622": "christian dior,spring 2016 couture", "623": "christian dior,spring 2016 ready to wear", "624": "christian dior,spring 2017 couture", "625": "christian dior,spring 2017 ready to wear", "626": "christian dior,spring 2018 couture", "627": "christian dior,spring 2018 ready to wear", "628": "christian dior,spring 2019 couture", "629": "christian dior,spring 2019 ready to wear", "630": "christian dior,spring 2020 couture", "631": "christian dior,spring 2020 ready to wear", "632": "christian dior,spring 2021 couture", "633": "christian dior,spring 2021 ready to wear", "634": "christian dior,spring 2022 couture", "635": "christian dior,spring 2022 ready to wear", "636": "christian dior,spring 2023 couture", "637": "christian dior,spring 2023 ready to wear", "638": "christian dior,spring 2024 
ready to wear", "639": "fendi,fall 1999 ready to wear", "640": "fendi,fall 2000 ready to wear", "641": "fendi,fall 2001 ready to wear", "642": "fendi,fall 2002 ready to wear", "643": "fendi,fall 2003 ready to wear", "644": "fendi,fall 2004 ready to wear", "645": "fendi,fall 2005 ready to wear", "646": "fendi,fall 2006 ready to wear", "647": "fendi,fall 2007 menswear", "648": "fendi,fall 2007 ready to wear", "649": "fendi,fall 2008 menswear", "650": "fendi,fall 2008 ready to wear", "651": "fendi,fall 2009 ready to wear", "652": "fendi,fall 2010 ready to wear", "653": "fendi,fall 2011 ready to wear", "654": "fendi,fall 2012 menswear", "655": "fendi,fall 2012 ready to wear", "656": "fendi,fall 2013 menswear", "657": "fendi,fall 2013 ready to wear", "658": "fendi,fall 2014 menswear", "659": "fendi,fall 2014 ready to wear", "660": "fendi,fall 2015 couture", "661": "fendi,fall 2015 menswear", "662": "fendi,fall 2015 ready to wear", "663": "fendi,fall 2016 couture", "664": "fendi,fall 2016 menswear", "665": "fendi,fall 2016 ready to wear", "666": "fendi,fall 2017 couture", "667": "fendi,fall 2017 menswear", "668": "fendi,fall 2017 ready to wear", "669": "fendi,fall 2018 couture", "670": "fendi,fall 2018 menswear", "671": "fendi,fall 2018 ready to wear", "672": "fendi,fall 2019 couture", "673": "fendi,fall 2019 menswear", "674": "fendi,fall 2019 ready to wear", "675": "fendi,fall 2020 menswear", "676": "fendi,fall 2020 ready to wear", "677": "fendi,fall 2021 couture", "678": "fendi,fall 2021 menswear", "679": "fendi,fall 2021 ready to wear", "680": "fendi,fall 2022 couture", "681": "fendi,fall 2022 menswear", "682": "fendi,fall 2022 ready to wear", "683": "fendi,fall 2023 couture", "684": "fendi,fall 2023 menswear", "685": "fendi,fall 2023 ready to wear", "686": "fendi,pre fall 2011", "687": "fendi,pre fall 2012", "688": "fendi,pre fall 2013", "689": "fendi,pre fall 2014", "690": "fendi,pre fall 2015", "691": "fendi,pre fall 2016", "692": "fendi,pre fall 2017", "693": "fendi,pre fall 2018", "694": "fendi,pre fall 2019", "695": "fendi,pre fall 2020", "696": "fendi,pre fall 2022", "697": "fendi,resort 2008", "698": "fendi,resort 2009", "699": "fendi,resort 2012", "700": "fendi,resort 2013", "701": "fendi,resort 2014", "702": "fendi,resort 2015", "703": "fendi,resort 2016", "704": "fendi,resort 2017", "705": "fendi,resort 2018", "706": "fendi,resort 2019", "707": "fendi,resort 2020", "708": "fendi,resort 2022", "709": "fendi,resort 2023", "710": "fendi,resort 2024", "711": "fendi,spring 1999 ready to wear", "712": "fendi,spring 2000 ready to wear", "713": "fendi,spring 2001 ready to wear", "714": "fendi,spring 2002 ready to wear", "715": "fendi,spring 2003 ready to wear", "716": "fendi,spring 2004 ready to wear", "717": "fendi,spring 2005 ready to wear", "718": "fendi,spring 2006 ready to wear", "719": "fendi,spring 2007 ready to wear", "720": "fendi,spring 2008 menswear", "721": "fendi,spring 2008 ready to wear", "722": "fendi,spring 2009 menswear", "723": "fendi,spring 2009 ready to wear", "724": "fendi,spring 2010 ready to wear", "725": "fendi,spring 2011 ready to wear", "726": "fendi,spring 2012 ready to wear", "727": "fendi,spring 2013 menswear", "728": "fendi,spring 2013 ready to wear", "729": "fendi,spring 2014 menswear", "730": "fendi,spring 2014 ready to wear", "731": "fendi,spring 2015 menswear", "732": "fendi,spring 2015 ready to wear", "733": "fendi,spring 2016 menswear", "734": "fendi,spring 2016 ready to wear", "735": "fendi,spring 2017 menswear", "736": "fendi,spring 2017 ready to 
wear", "737": "fendi,spring 2018 menswear", "738": "fendi,spring 2018 ready to wear", "739": "fendi,spring 2019 menswear", "740": "fendi,spring 2019 ready to wear", "741": "fendi,spring 2020 menswear", "742": "fendi,spring 2020 ready to wear", "743": "fendi,spring 2021 couture", "744": "fendi,spring 2021 menswear", "745": "fendi,spring 2021 ready to wear", "746": "fendi,spring 2022 couture", "747": "fendi,spring 2022 menswear", "748": "fendi,spring 2022 ready to wear", "749": "fendi,spring 2023 couture", "750": "fendi,spring 2023 menswear", "751": "fendi,spring 2023 ready to wear", "752": "fendi,spring 2024 menswear", "753": "fendi,spring 2024 ready to wear", "754": "gucci,fall 1995 ready to wear", "755": "gucci,fall 1996 ready to wear", "756": "gucci,fall 2000 ready to wear", "757": "gucci,fall 2001 ready to wear", "758": "gucci,fall 2002 ready to wear", "759": "gucci,fall 2003 ready to wear", "760": "gucci,fall 2004 ready to wear", "761": "gucci,fall 2005 menswear", "762": "gucci,fall 2005 ready to wear", "763": "gucci,fall 2006 menswear", "764": "gucci,fall 2006 ready to wear", "765": "gucci,fall 2007 menswear", "766": "gucci,fall 2007 ready to wear", "767": "gucci,fall 2008 menswear", "768": "gucci,fall 2008 ready to wear", "769": "gucci,fall 2009 ready to wear", "770": "gucci,fall 2010 menswear", "771": "gucci,fall 2010 ready to wear", "772": "gucci,fall 2011 menswear", "773": "gucci,fall 2011 ready to wear", "774": "gucci,fall 2012 menswear", "775": "gucci,fall 2012 ready to wear", "776": "gucci,fall 2013 menswear", "777": "gucci,fall 2013 ready to wear", "778": "gucci,fall 2014 menswear", "779": "gucci,fall 2014 ready to wear", "780": "gucci,fall 2015 menswear", "781": "gucci,fall 2015 ready to wear", "782": "gucci,fall 2016 menswear", "783": "gucci,fall 2016 ready to wear", "784": "gucci,fall 2017 menswear", "785": "gucci,fall 2017 ready to wear", "786": "gucci,fall 2018 menswear", "787": "gucci,fall 2018 ready to wear", "788": "gucci,fall 2019 menswear", "789": "gucci,fall 2019 ready to wear", "790": "gucci,fall 2020 menswear", "791": "gucci,fall 2020 ready to wear", "792": "gucci,fall 2022 ready to wear", "793": "gucci,fall 2023 menswear", "794": "gucci,fall 2023 ready to wear", "795": "gucci,pre fall 2011", "796": "gucci,pre fall 2012", "797": "gucci,pre fall 2013", "798": "gucci,pre fall 2014", "799": "gucci,pre fall 2015", "800": "gucci,pre fall 2016", "801": "gucci,pre fall 2017", "802": "gucci,pre fall 2018", "803": "gucci,pre fall 2019", "804": "gucci,pre fall 2020", "805": "gucci,pre fall 2020 menswear", "806": "gucci,pre fall 2021", "807": "gucci,pre fall 2021 menswear", "808": "gucci,pre fall 2022", "809": "gucci,resort 2007", "810": "gucci,resort 2008", "811": "gucci,resort 2009", "812": "gucci,resort 2010", "813": "gucci,resort 2011", "814": "gucci,resort 2012", "815": "gucci,resort 2013", "816": "gucci,resort 2014", "817": "gucci,resort 2015", "818": "gucci,resort 2016", "819": "gucci,resort 2017", "820": "gucci,resort 2018", "821": "gucci,resort 2019", "822": "gucci,resort 2020", "823": "gucci,resort 2021", "824": "gucci,resort 2023", "825": "gucci,resort 2024", "826": "gucci,spring 1999 ready to wear", "827": "gucci,spring 2000 ready to wear", "828": "gucci,spring 2001 ready to wear", "829": "gucci,spring 2002 ready to wear", "830": "gucci,spring 2003 ready to wear", "831": "gucci,spring 2004 ready to wear", "832": "gucci,spring 2005 menswear", "833": "gucci,spring 2005 ready to wear", "834": "gucci,spring 2006 menswear", "835": "gucci,spring 2006 ready to wear", 
"836": "gucci,spring 2007 menswear", "837": "gucci,spring 2007 ready to wear", "838": "gucci,spring 2008 menswear", "839": "gucci,spring 2008 ready to wear", "840": "gucci,spring 2009 menswear", "841": "gucci,spring 2009 ready to wear", "842": "gucci,spring 2010 menswear", "843": "gucci,spring 2010 ready to wear", "844": "gucci,spring 2011 menswear", "845": "gucci,spring 2011 ready to wear", "846": "gucci,spring 2012 menswear", "847": "gucci,spring 2012 ready to wear", "848": "gucci,spring 2013 menswear", "849": "gucci,spring 2013 ready to wear", "850": "gucci,spring 2014 menswear", "851": "gucci,spring 2014 ready to wear", "852": "gucci,spring 2015 menswear", "853": "gucci,spring 2015 ready to wear", "854": "gucci,spring 2016 menswear", "855": "gucci,spring 2016 ready to wear", "856": "gucci,spring 2017 menswear", "857": "gucci,spring 2017 ready to wear", "858": "gucci,spring 2018 menswear", "859": "gucci,spring 2018 ready to wear", "860": "gucci,spring 2019 ready to wear", "861": "gucci,spring 2020 menswear", "862": "gucci,spring 2020 ready to wear", "863": "gucci,spring 2021 menswear", "864": "gucci,spring 2021 ready to wear", "865": "gucci,spring 2022 ready to wear", "866": "gucci,spring 2023 ready to wear", "867": "gucci,spring 2024 menswear", "868": "gucci,spring 2024 ready to wear", "869": "hermes,fall 1999 ready to wear", "870": "hermes,fall 2000 ready to wear", "871": "hermes,fall 2001 ready to wear", "872": "hermes,fall 2004 ready to wear", "873": "hermes,fall 2005 menswear", "874": "hermes,fall 2005 ready to wear", "875": "hermes,fall 2006 menswear", "876": "hermes,fall 2006 ready to wear", "877": "hermes,fall 2007 menswear", "878": "hermes,fall 2007 ready to wear", "879": "hermes,fall 2008 menswear", "880": "hermes,fall 2008 ready to wear", "881": "hermes,fall 2009 ready to wear", "882": "hermes,fall 2010 menswear", "883": "hermes,fall 2010 ready to wear", "884": "hermes,fall 2011 menswear", "885": "hermes,fall 2011 ready to wear", "886": "hermes,fall 2012 menswear", "887": "hermes,fall 2012 ready to wear", "888": "hermes,fall 2013 menswear", "889": "hermes,fall 2013 ready to wear", "890": "hermes,fall 2014 menswear", "891": "hermes,fall 2014 ready to wear", "892": "hermes,fall 2015 menswear", "893": "hermes,fall 2015 ready to wear", "894": "hermes,fall 2016 menswear", "895": "hermes,fall 2016 ready to wear", "896": "hermes,fall 2017 menswear", "897": "hermes,fall 2017 ready to wear", "898": "hermes,fall 2018 menswear", "899": "hermes,fall 2018 ready to wear", "900": "hermes,fall 2019 menswear", "901": "hermes,fall 2019 ready to wear", "902": "hermes,fall 2020 menswear", "903": "hermes,fall 2020 ready to wear", "904": "hermes,fall 2021 menswear", "905": "hermes,fall 2021 ready to wear", "906": "hermes,fall 2022 menswear", "907": "hermes,fall 2022 ready to wear", "908": "hermes,fall 2023 menswear", "909": "hermes,fall 2023 ready to wear", "910": "hermes,pre fall 2017", "911": "hermes,pre fall 2018", "912": "hermes,pre fall 2019", "913": "hermes,resort 2017", "914": "hermes,resort 2018", "915": "hermes,resort 2019", "916": "hermes,spring 1999 ready to wear", "917": "hermes,spring 2000 ready to wear", "918": "hermes,spring 2001 ready to wear", "919": "hermes,spring 2002 ready to wear", "920": "hermes,spring 2006 menswear", "921": "hermes,spring 2006 ready to wear", "922": "hermes,spring 2007 menswear", "923": "hermes,spring 2007 ready to wear", "924": "hermes,spring 2008 menswear", "925": "hermes,spring 2008 ready to wear", "926": "hermes,spring 2009 menswear", "927": 
"hermes,spring 2010 menswear", "928": "hermes,spring 2010 ready to wear", "929": "hermes,spring 2011 menswear", "930": "hermes,spring 2011 ready to wear", "931": "hermes,spring 2012 menswear", "932": "hermes,spring 2012 ready to wear", "933": "hermes,spring 2013 menswear", "934": "hermes,spring 2013 ready to wear", "935": "hermes,spring 2014 menswear", "936": "hermes,spring 2014 ready to wear", "937": "hermes,spring 2015 menswear", "938": "hermes,spring 2015 ready to wear", "939": "hermes,spring 2016 menswear", "940": "hermes,spring 2016 ready to wear", "941": "hermes,spring 2017 menswear", "942": "hermes,spring 2017 ready to wear", "943": "hermes,spring 2018 menswear", "944": "hermes,spring 2018 ready to wear", "945": "hermes,spring 2019 menswear", "946": "hermes,spring 2019 ready to wear", "947": "hermes,spring 2020 menswear", "948": "hermes,spring 2020 ready to wear", "949": "hermes,spring 2021 menswear", "950": "hermes,spring 2021 ready to wear", "951": "hermes,spring 2022 menswear", "952": "hermes,spring 2022 ready to wear", "953": "hermes,spring 2023 menswear", "954": "hermes,spring 2023 ready to wear", "955": "hermes,spring 2024 menswear", "956": "hermes,spring 2024 ready to wear", "957": "louis vuitton,fall 1998 ready to wear", "958": "louis vuitton,fall 2000 ready to wear", "959": "louis vuitton,fall 2001 ready to wear", "960": "louis vuitton,fall 2002 ready to wear", "961": "louis vuitton,fall 2003 ready to wear", "962": "louis vuitton,fall 2004 ready to wear", "963": "louis vuitton,fall 2005 menswear", "964": "louis vuitton,fall 2005 ready to wear", "965": "louis vuitton,fall 2006 menswear", "966": "louis vuitton,fall 2006 ready to wear", "967": "louis vuitton,fall 2007 menswear", "968": "louis vuitton,fall 2008 menswear", "969": "louis vuitton,fall 2008 ready to wear", "970": "louis vuitton,fall 2009 ready to wear", "971": "louis vuitton,fall 2010 menswear", "972": "louis vuitton,fall 2010 ready to wear", "973": "louis vuitton,fall 2011 menswear", "974": "louis vuitton,fall 2011 ready to wear", "975": "louis vuitton,fall 2012 menswear", "976": "louis vuitton,fall 2012 ready to wear", "977": "louis vuitton,fall 2013 menswear", "978": "louis vuitton,fall 2013 ready to wear", "979": "louis vuitton,fall 2014 menswear", "980": "louis vuitton,fall 2014 ready to wear", "981": "louis vuitton,fall 2015 menswear", "982": "louis vuitton,fall 2015 ready to wear", "983": "louis vuitton,fall 2016 menswear", "984": "louis vuitton,fall 2016 ready to wear", "985": "louis vuitton,fall 2017 menswear", "986": "louis vuitton,fall 2017 ready to wear", "987": "louis vuitton,fall 2018 menswear", "988": "louis vuitton,fall 2018 ready to wear", "989": "louis vuitton,fall 2019 menswear", "990": "louis vuitton,fall 2019 ready to wear", "991": "louis vuitton,fall 2020 menswear", "992": "louis vuitton,fall 2020 ready to wear", "993": "louis vuitton,fall 2021 menswear", "994": "louis vuitton,fall 2021 ready to wear", "995": "louis vuitton,fall 2022 menswear", "996": "louis vuitton,fall 2022 ready to wear", "997": "louis vuitton,fall 2023 menswear", "998": "louis vuitton,fall 2023 ready to wear", "999": "louis vuitton,pre fall 2008", "1000": "louis vuitton,pre fall 2009", "1001": "louis vuitton,pre fall 2010", "1002": "louis vuitton,pre fall 2011", "1003": "louis vuitton,pre fall 2012", "1004": "louis vuitton,pre fall 2013", "1005": "louis vuitton,pre fall 2014", "1006": "louis vuitton,pre fall 2015", "1007": "louis vuitton,pre fall 2016", "1008": "louis vuitton,pre fall 2017", "1009": "louis vuitton,pre fall 
2018", "1010": "louis vuitton,pre fall 2019", "1011": "louis vuitton,pre fall 2020", "1012": "louis vuitton,pre fall 2020 menswear", "1013": "louis vuitton,pre fall 2021", "1014": "louis vuitton,pre fall 2021 menswear", "1015": "louis vuitton,pre fall 2022 menswear", "1016": "louis vuitton,pre fall 2023", "1017": "louis vuitton,pre fall 2023 menswear", "1018": "louis vuitton,pre fall 2024 menswear", "1019": "louis vuitton,resort 2008", "1020": "louis vuitton,resort 2009", "1021": "louis vuitton,resort 2010", "1022": "louis vuitton,resort 2011", "1023": "louis vuitton,resort 2012", "1024": "louis vuitton,resort 2013", "1025": "louis vuitton,resort 2014", "1026": "louis vuitton,resort 2015", "1027": "louis vuitton,resort 2016", "1028": "louis vuitton,resort 2017", "1029": "louis vuitton,resort 2018", "1030": "louis vuitton,resort 2019", "1031": "louis vuitton,resort 2020", "1032": "louis vuitton,resort 2021", "1033": "louis vuitton,resort 2021 menswear", "1034": "louis vuitton,resort 2022", "1035": "louis vuitton,resort 2022 menswear", "1036": "louis vuitton,resort 2023", "1037": "louis vuitton,resort 2023 menswear", "1038": "louis vuitton,resort 2024", "1039": "louis vuitton,resort 2024 menswear", "1040": "louis vuitton,spring 2000 ready to wear", "1041": "louis vuitton,spring 2001 ready to wear", "1042": "louis vuitton,spring 2002 ready to wear", "1043": "louis vuitton,spring 2003 ready to wear", "1044": "louis vuitton,spring 2004 ready to wear", "1045": "louis vuitton,spring 2005 menswear", "1046": "louis vuitton,spring 2005 ready to wear", "1047": "louis vuitton,spring 2006 menswear", "1048": "louis vuitton,spring 2006 ready to wear", "1049": "louis vuitton,spring 2007 menswear", "1050": "louis vuitton,spring 2007 ready to wear", "1051": "louis vuitton,spring 2008 menswear", "1052": "louis vuitton,spring 2008 ready to wear", "1053": "louis vuitton,spring 2009 menswear", "1054": "louis vuitton,spring 2009 ready to wear", "1055": "louis vuitton,spring 2010 menswear", "1056": "louis vuitton,spring 2010 ready to wear", "1057": "louis vuitton,spring 2011 menswear", "1058": "louis vuitton,spring 2011 ready to wear", "1059": "louis vuitton,spring 2012 menswear", "1060": "louis vuitton,spring 2012 ready to wear", "1061": "louis vuitton,spring 2013 menswear", "1062": "louis vuitton,spring 2013 ready to wear", "1063": "louis vuitton,spring 2014 menswear", "1064": "louis vuitton,spring 2014 ready to wear", "1065": "louis vuitton,spring 2015 menswear", "1066": "louis vuitton,spring 2015 ready to wear", "1067": "louis vuitton,spring 2016 menswear", "1068": "louis vuitton,spring 2016 ready to wear", "1069": "louis vuitton,spring 2017 menswear", "1070": "louis vuitton,spring 2017 ready to wear", "1071": "louis vuitton,spring 2018 menswear", "1072": "louis vuitton,spring 2018 ready to wear", "1073": "louis vuitton,spring 2019 menswear", "1074": "louis vuitton,spring 2019 ready to wear", "1075": "louis vuitton,spring 2020 menswear", "1076": "louis vuitton,spring 2020 ready to wear", "1077": "louis vuitton,spring 2021 menswear", "1078": "louis vuitton,spring 2021 ready to wear", "1079": "louis vuitton,spring 2022 menswear", "1080": "louis vuitton,spring 2023 menswear", "1081": "louis vuitton,spring 2023 ready to wear", "1082": "louis vuitton,spring 2024 menswear", "1083": "prada,fall 1996 ready to wear", "1084": "prada,fall 2000 ready to wear", "1085": "prada,fall 2001 ready to wear", "1086": "prada,fall 2002 ready to wear", "1087": "prada,fall 2003 ready to wear", "1088": "prada,fall 2004 ready to wear", 
"1089": "prada,fall 2005 menswear", "1090": "prada,fall 2005 ready to wear", "1091": "prada,fall 2006 menswear", "1092": "prada,fall 2006 ready to wear", "1093": "prada,fall 2007 menswear", "1094": "prada,fall 2007 ready to wear", "1095": "prada,fall 2008 menswear", "1096": "prada,fall 2008 ready to wear", "1097": "prada,fall 2009 menswear", "1098": "prada,fall 2009 ready to wear", "1099": "prada,fall 2010 menswear", "1100": "prada,fall 2010 ready to wear", "1101": "prada,fall 2011 menswear", "1102": "prada,fall 2011 ready to wear", "1103": "prada,fall 2012 menswear", "1104": "prada,fall 2012 ready to wear", "1105": "prada,fall 2013 menswear", "1106": "prada,fall 2013 ready to wear", "1107": "prada,fall 2014 menswear", "1108": "prada,fall 2014 ready to wear", "1109": "prada,fall 2015 menswear", "1110": "prada,fall 2015 ready to wear", "1111": "prada,fall 2016 menswear", "1112": "prada,fall 2016 ready to wear", "1113": "prada,fall 2017 menswear", "1114": "prada,fall 2017 ready to wear", "1115": "prada,fall 2018 menswear", "1116": "prada,fall 2018 ready to wear", "1117": "prada,fall 2019 menswear", "1118": "prada,fall 2019 ready to wear", "1119": "prada,fall 2020 menswear", "1120": "prada,fall 2020 ready to wear", "1121": "prada,fall 2021 menswear", "1122": "prada,fall 2021 ready to wear", "1123": "prada,fall 2022 menswear", "1124": "prada,fall 2022 ready to wear", "1125": "prada,fall 2023 menswear", "1126": "prada,fall 2023 ready to wear", "1127": "prada,pre fall 2009", "1128": "prada,pre fall 2010", "1129": "prada,resort 2008", "1130": "prada,resort 2009", "1131": "prada,resort 2010", "1132": "prada,resort 2011", "1133": "prada,resort 2012", "1134": "prada,resort 2013", "1135": "prada,resort 2018", "1136": "prada,resort 2019", "1137": "prada,resort 2020", "1138": "prada,spring 1992 ready to wear", "1139": "prada,spring 1993 ready to wear", "1140": "prada,spring 1994 ready to wear", "1141": "prada,spring 1995 ready to wear", "1142": "prada,spring 1996 ready to wear", "1143": "prada,spring 1997 ready to wear", "1144": "prada,spring 1998 ready to wear", "1145": "prada,spring 1999 ready to wear", "1146": "prada,spring 2000 ready to wear", "1147": "prada,spring 2001 ready to wear", "1148": "prada,spring 2002 ready to wear", "1149": "prada,spring 2003 ready to wear", "1150": "prada,spring 2004 ready to wear", "1151": "prada,spring 2005 menswear", "1152": "prada,spring 2005 ready to wear", "1153": "prada,spring 2006 menswear", "1154": "prada,spring 2006 ready to wear", "1155": "prada,spring 2007 menswear", "1156": "prada,spring 2007 ready to wear", "1157": "prada,spring 2008 menswear", "1158": "prada,spring 2008 ready to wear", "1159": "prada,spring 2009 menswear", "1160": "prada,spring 2009 ready to wear", "1161": "prada,spring 2010 ready to wear", "1162": "prada,spring 2011 menswear", "1163": "prada,spring 2011 ready to wear", "1164": "prada,spring 2012 menswear", "1165": "prada,spring 2012 ready to wear", "1166": "prada,spring 2013 menswear", "1167": "prada,spring 2013 ready to wear", "1168": "prada,spring 2014 menswear", "1169": "prada,spring 2014 ready to wear", "1170": "prada,spring 2015 menswear", "1171": "prada,spring 2015 ready to wear", "1172": "prada,spring 2016 menswear", "1173": "prada,spring 2016 ready to wear", "1174": "prada,spring 2017 menswear", "1175": "prada,spring 2017 ready to wear", "1176": "prada,spring 2018 menswear", "1177": "prada,spring 2018 ready to wear", "1178": "prada,spring 2019 menswear", "1179": "prada,spring 2019 ready to wear", "1180": "prada,spring 2020 
menswear", "1181": "prada,spring 2020 ready to wear", "1182": "prada,spring 2021 menswear", "1183": "prada,spring 2021 ready to wear", "1184": "prada,spring 2022 menswear", "1185": "prada,spring 2022 ready to wear", "1186": "prada,spring 2023 menswear", "1187": "prada,spring 2023 ready to wear", "1188": "prada,spring 2024 menswear", "1189": "prada,spring 2024 ready to wear", "1190": "ralph lauren,fall 2000 ready to wear", "1191": "ralph lauren,fall 2001 ready to wear", "1192": "ralph lauren,fall 2002 ready to wear", "1193": "ralph lauren,fall 2003 ready to wear", "1194": "ralph lauren,fall 2004 ready to wear", "1195": "ralph lauren,fall 2005 menswear", "1196": "ralph lauren,fall 2005 ready to wear", "1197": "ralph lauren,fall 2006 menswear", "1198": "ralph lauren,fall 2006 ready to wear", "1199": "ralph lauren,fall 2007 menswear", "1200": "ralph lauren,fall 2007 ready to wear", "1201": "ralph lauren,fall 2008 menswear", "1202": "ralph lauren,fall 2008 ready to wear", "1203": "ralph lauren,fall 2009 ready to wear", "1204": "ralph lauren,fall 2010 menswear", "1205": "ralph lauren,fall 2010 ready to wear", "1206": "ralph lauren,fall 2011 ready to wear", "1207": "ralph lauren,fall 2012 ready to wear", "1208": "ralph lauren,fall 2013 menswear", "1209": "ralph lauren,fall 2013 ready to wear", "1210": "ralph lauren,fall 2014 menswear", "1211": "ralph lauren,fall 2014 ready to wear", "1212": "ralph lauren,fall 2015 menswear", "1213": "ralph lauren,fall 2015 ready to wear", "1214": "ralph lauren,fall 2016 menswear", "1215": "ralph lauren,fall 2016 ready to wear", "1216": "ralph lauren,fall 2017 menswear", "1217": "ralph lauren,fall 2017 ready to wear", "1218": "ralph lauren,fall 2018 menswear", "1219": "ralph lauren,fall 2018 ready to wear", "1220": "ralph lauren,fall 2019 menswear", "1221": "ralph lauren,fall 2019 ready to wear", "1222": "ralph lauren,fall 2020 menswear", "1223": "ralph lauren,fall 2020 ready to wear", "1224": "ralph lauren,fall 2021 ready to wear", "1225": "ralph lauren,fall 2022 ready to wear", "1226": "ralph lauren,fall 2023 ready to wear", "1227": "ralph lauren,pre fall 2014", "1228": "ralph lauren,pre fall 2015", "1229": "ralph lauren,pre fall 2016", "1230": "ralph lauren,pre fall 2017", "1231": "ralph lauren,pre fall 2018", "1232": "ralph lauren,pre fall 2019", "1233": "ralph lauren,pre fall 2020", "1234": "ralph lauren,pre fall 2021", "1235": "ralph lauren,resort 2008", "1236": "ralph lauren,resort 2009", "1237": "ralph lauren,resort 2013", "1238": "ralph lauren,resort 2014", "1239": "ralph lauren,resort 2015", "1240": "ralph lauren,resort 2016", "1241": "ralph lauren,resort 2019", "1242": "ralph lauren,resort 2022", "1243": "ralph lauren,resort 2024", "1244": "ralph lauren,spring 2000 ready to wear", "1245": "ralph lauren,spring 2001 ready to wear", "1246": "ralph lauren,spring 2002 ready to wear", "1247": "ralph lauren,spring 2003 ready to wear", "1248": "ralph lauren,spring 2004 ready to wear", "1249": "ralph lauren,spring 2005 ready to wear", "1250": "ralph lauren,spring 2006 menswear", "1251": "ralph lauren,spring 2006 ready to wear", "1252": "ralph lauren,spring 2007 menswear", "1253": "ralph lauren,spring 2007 ready to wear", "1254": "ralph lauren,spring 2008 menswear", "1255": "ralph lauren,spring 2008 ready to wear", "1256": "ralph lauren,spring 2009 ready to wear", "1257": "ralph lauren,spring 2010 ready to wear", "1258": "ralph lauren,spring 2011 ready to wear", "1259": "ralph lauren,spring 2012 ready to wear", "1260": "ralph lauren,spring 2013 menswear", "1261": 
"ralph lauren,spring 2013 ready to wear", "1262": "ralph lauren,spring 2014 menswear", "1263": "ralph lauren,spring 2014 ready to wear", "1264": "ralph lauren,spring 2015 menswear", "1265": "ralph lauren,spring 2015 ready to wear", "1266": "ralph lauren,spring 2016 menswear", "1267": "ralph lauren,spring 2016 ready to wear", "1268": "ralph lauren,spring 2017 menswear", "1269": "ralph lauren,spring 2017 ready to wear", "1270": "ralph lauren,spring 2018 menswear", "1271": "ralph lauren,spring 2018 ready to wear", "1272": "ralph lauren,spring 2019 menswear", "1273": "ralph lauren,spring 2019 ready to wear", "1274": "ralph lauren,spring 2020 menswear", "1275": "ralph lauren,spring 2021 ready to wear", "1276": "ralph lauren,spring 2022 ready to wear", "1277": "ralph lauren,spring 2023 ready to wear", "1278": "ralph lauren,spring 2024 menswear", "1279": "ralph lauren,spring 2024 ready to wear", "1280": "saint laurent,fall 2000 ready to wear", "1281": "saint laurent,fall 2001 couture", "1282": "saint laurent,fall 2001 ready to wear", "1283": "saint laurent,fall 2002 ready to wear", "1284": "saint laurent,fall 2003 ready to wear", "1285": "saint laurent,fall 2004 ready to wear", "1286": "saint laurent,fall 2005 menswear", "1287": "saint laurent,fall 2005 ready to wear", "1288": "saint laurent,fall 2006 menswear", "1289": "saint laurent,fall 2006 ready to wear", "1290": "saint laurent,fall 2007 menswear", "1291": "saint laurent,fall 2007 ready to wear", "1292": "saint laurent,fall 2008 menswear", "1293": "saint laurent,fall 2008 ready to wear", "1294": "saint laurent,fall 2009 ready to wear", "1295": "saint laurent,fall 2010 menswear", "1296": "saint laurent,fall 2010 ready to wear", "1297": "saint laurent,fall 2011 menswear", "1298": "saint laurent,fall 2011 ready to wear", "1299": "saint laurent,fall 2012 menswear", "1300": "saint laurent,fall 2012 ready to wear", "1301": "saint laurent,fall 2013 menswear", "1302": "saint laurent,fall 2013 ready to wear", "1303": "saint laurent,fall 2014 menswear", "1304": "saint laurent,fall 2014 ready to wear", "1305": "saint laurent,fall 2015 menswear", "1306": "saint laurent,fall 2015 ready to wear", "1307": "saint laurent,fall 2016 menswear", "1308": "saint laurent,fall 2016 ready to wear", "1309": "saint laurent,fall 2017 ready to wear", "1310": "saint laurent,fall 2018 ready to wear", "1311": "saint laurent,fall 2019 menswear", "1312": "saint laurent,fall 2019 ready to wear", "1313": "saint laurent,fall 2020 ready to wear", "1314": "saint laurent,fall 2021 menswear", "1315": "saint laurent,fall 2021 ready to wear", "1316": "saint laurent,fall 2022 menswear", "1317": "saint laurent,fall 2022 ready to wear", "1318": "saint laurent,fall 2023 menswear", "1319": "saint laurent,fall 2023 ready to wear", "1320": "saint laurent,pre fall 2009", "1321": "saint laurent,pre fall 2010", "1322": "saint laurent,pre fall 2011", "1323": "saint laurent,pre fall 2012", "1324": "saint laurent,pre fall 2013", "1325": "saint laurent,pre fall 2016", "1326": "saint laurent,pre fall 2019", "1327": "saint laurent,pre fall 2020", "1328": "saint laurent,pre fall 2020 menswear", "1329": "saint laurent,pre fall 2021", "1330": "saint laurent,pre fall 2022", "1331": "saint laurent,pre fall 2023", "1332": "saint laurent,resort 2008", "1333": "saint laurent,resort 2010", "1334": "saint laurent,resort 2011", "1335": "saint laurent,resort 2012", "1336": "saint laurent,resort 2014", "1337": "saint laurent,resort 2020", "1338": "saint laurent,resort 2021", "1339": "saint laurent,resort 2022", 
"1340": "saint laurent,resort 2023", "1341": "saint laurent,spring 2000 ready to wear", "1342": "saint laurent,spring 2001 couture", "1343": "saint laurent,spring 2001 ready to wear", "1344": "saint laurent,spring 2002 couture", "1345": "saint laurent,spring 2002 ready to wear", "1346": "saint laurent,spring 2003 ready to wear", "1347": "saint laurent,spring 2004 ready to wear", "1348": "saint laurent,spring 2005 menswear", "1349": "saint laurent,spring 2005 ready to wear", "1350": "saint laurent,spring 2006 menswear", "1351": "saint laurent,spring 2006 ready to wear", "1352": "saint laurent,spring 2007 menswear", "1353": "saint laurent,spring 2007 ready to wear", "1354": "saint laurent,spring 2008 menswear", "1355": "saint laurent,spring 2008 ready to wear", "1356": "saint laurent,spring 2009 menswear", "1357": "saint laurent,spring 2009 ready to wear", "1358": "saint laurent,spring 2010 ready to wear", "1359": "saint laurent,spring 2011 menswear", "1360": "saint laurent,spring 2011 ready to wear", "1361": "saint laurent,spring 2012 menswear", "1362": "saint laurent,spring 2012 ready to wear", "1363": "saint laurent,spring 2013 ready to wear", "1364": "saint laurent,spring 2014 menswear", "1365": "saint laurent,spring 2014 ready to wear", "1366": "saint laurent,spring 2015 menswear", "1367": "saint laurent,spring 2015 ready to wear", "1368": "saint laurent,spring 2016 menswear", "1369": "saint laurent,spring 2016 ready to wear", "1370": "saint laurent,spring 2017 ready to wear", "1371": "saint laurent,spring 2018 ready to wear", "1372": "saint laurent,spring 2019 menswear", "1373": "saint laurent,spring 2019 ready to wear", "1374": "saint laurent,spring 2020 menswear", "1375": "saint laurent,spring 2020 ready to wear", "1376": "saint laurent,spring 2021 menswear", "1377": "saint laurent,spring 2021 ready to wear", "1378": "saint laurent,spring 2022 menswear", "1379": "saint laurent,spring 2022 ready to wear", "1380": "saint laurent,spring 2023 menswear", "1381": "saint laurent,spring 2023 ready to wear", "1382": "saint laurent,spring 2024 menswear", "1383": "saint laurent,spring 2024 ready to wear", "1384": "valentino,fall 2000 ready to wear", "1385": "valentino,fall 2001 couture", "1386": "valentino,fall 2001 ready to wear", "1387": "valentino,fall 2002 couture", "1388": "valentino,fall 2002 ready to wear", "1389": "valentino,fall 2003 couture", "1390": "valentino,fall 2003 ready to wear", "1391": "valentino,fall 2004 couture", "1392": "valentino,fall 2004 ready to wear", "1393": "valentino,fall 2005 couture", "1394": "valentino,fall 2005 menswear", "1395": "valentino,fall 2005 ready to wear", "1396": "valentino,fall 2006 couture", "1397": "valentino,fall 2006 menswear", "1398": "valentino,fall 2006 ready to wear", "1399": "valentino,fall 2007 couture", "1400": "valentino,fall 2007 menswear", "1401": "valentino,fall 2007 ready to wear", "1402": "valentino,fall 2008 couture", "1403": "valentino,fall 2008 menswear", "1404": "valentino,fall 2008 ready to wear", "1405": "valentino,fall 2009 couture", "1406": "valentino,fall 2009 ready to wear", "1407": "valentino,fall 2010 couture", "1408": "valentino,fall 2010 ready to wear", "1409": "valentino,fall 2011 couture", "1410": "valentino,fall 2011 ready to wear", "1411": "valentino,fall 2012 couture", "1412": "valentino,fall 2012 menswear", "1413": "valentino,fall 2012 ready to wear", "1414": "valentino,fall 2013 couture", "1415": "valentino,fall 2013 menswear", "1416": "valentino,fall 2013 ready to wear", "1417": "valentino,fall 2014 couture", 
"1418": "valentino,fall 2014 menswear", "1419": "valentino,fall 2014 ready to wear", "1420": "valentino,fall 2015 couture", "1421": "valentino,fall 2015 menswear", "1422": "valentino,fall 2015 ready to wear", "1423": "valentino,fall 2016 couture", "1424": "valentino,fall 2016 menswear", "1425": "valentino,fall 2016 ready to wear", "1426": "valentino,fall 2017 couture", "1427": "valentino,fall 2017 menswear", "1428": "valentino,fall 2017 ready to wear", "1429": "valentino,fall 2018 couture", "1430": "valentino,fall 2018 menswear", "1431": "valentino,fall 2018 ready to wear", "1432": "valentino,fall 2019 couture", "1433": "valentino,fall 2019 menswear", "1434": "valentino,fall 2019 ready to wear", "1435": "valentino,fall 2020 couture", "1436": "valentino,fall 2020 menswear", "1437": "valentino,fall 2020 ready to wear", "1438": "valentino,fall 2021 couture", "1439": "valentino,fall 2021 ready to wear", "1440": "valentino,fall 2022 couture", "1441": "valentino,fall 2022 ready to wear", "1442": "valentino,fall 2023 couture", "1443": "valentino,fall 2023 ready to wear", "1444": "valentino,pre fall 2008", "1445": "valentino,pre fall 2010", "1446": "valentino,pre fall 2011", "1447": "valentino,pre fall 2012", "1448": "valentino,pre fall 2013", "1449": "valentino,pre fall 2014", "1450": "valentino,pre fall 2015", "1451": "valentino,pre fall 2016", "1452": "valentino,pre fall 2017", "1453": "valentino,pre fall 2018", "1454": "valentino,pre fall 2019", "1455": "valentino,pre fall 2020", "1456": "valentino,pre fall 2021", "1457": "valentino,pre fall 2022", "1458": "valentino,pre fall 2023", "1459": "valentino,pre fall 2024", "1460": "valentino,resort 2008", "1461": "valentino,resort 2009", "1462": "valentino,resort 2011", "1463": "valentino,resort 2012", "1464": "valentino,resort 2013", "1465": "valentino,resort 2014", "1466": "valentino,resort 2015", "1467": "valentino,resort 2016", "1468": "valentino,resort 2017", "1469": "valentino,resort 2018", "1470": "valentino,resort 2019", "1471": "valentino,resort 2020", "1472": "valentino,resort 2021", "1473": "valentino,resort 2022", "1474": "valentino,resort 2023", "1475": "valentino,resort 2024", "1476": "valentino,spring 2000 ready to wear", "1477": "valentino,spring 2001 couture", "1478": "valentino,spring 2001 ready to wear", "1479": "valentino,spring 2002 couture", "1480": "valentino,spring 2002 ready to wear", "1481": "valentino,spring 2003 couture", "1482": "valentino,spring 2003 ready to wear", "1483": "valentino,spring 2004 couture", "1484": "valentino,spring 2004 ready to wear", "1485": "valentino,spring 2005 couture", "1486": "valentino,spring 2005 menswear", "1487": "valentino,spring 2005 ready to wear", "1488": "valentino,spring 2006 couture", "1489": "valentino,spring 2006 menswear", "1490": "valentino,spring 2006 ready to wear", "1491": "valentino,spring 2007 couture", "1492": "valentino,spring 2007 menswear", "1493": "valentino,spring 2007 ready to wear", "1494": "valentino,spring 2008 couture", "1495": "valentino,spring 2008 menswear", "1496": "valentino,spring 2008 ready to wear", "1497": "valentino,spring 2009 couture", "1498": "valentino,spring 2009 menswear", "1499": "valentino,spring 2009 ready to wear", "1500": "valentino,spring 2010 couture", "1501": "valentino,spring 2010 ready to wear", "1502": "valentino,spring 2011 couture", "1503": "valentino,spring 2011 ready to wear", "1504": "valentino,spring 2012 couture", "1505": "valentino,spring 2012 menswear", "1506": "valentino,spring 2012 ready to wear", "1507": "valentino,spring 2013 
couture", "1508": "valentino,spring 2013 menswear", "1509": "valentino,spring 2013 ready to wear", "1510": "valentino,spring 2014 couture", "1511": "valentino,spring 2014 menswear", "1512": "valentino,spring 2014 ready to wear", "1513": "valentino,spring 2015 couture", "1514": "valentino,spring 2015 menswear", "1515": "valentino,spring 2015 ready to wear", "1516": "valentino,spring 2016 couture", "1517": "valentino,spring 2016 menswear", "1518": "valentino,spring 2016 ready to wear", "1519": "valentino,spring 2017 couture", "1520": "valentino,spring 2017 menswear", "1521": "valentino,spring 2017 ready to wear", "1522": "valentino,spring 2018 couture", "1523": "valentino,spring 2018 menswear", "1524": "valentino,spring 2018 ready to wear", "1525": "valentino,spring 2019 couture", "1526": "valentino,spring 2019 menswear", "1527": "valentino,spring 2019 ready to wear", "1528": "valentino,spring 2020 couture", "1529": "valentino,spring 2020 menswear", "1530": "valentino,spring 2020 ready to wear", "1531": "valentino,spring 2021 couture", "1532": "valentino,spring 2021 menswear", "1533": "valentino,spring 2021 ready to wear", "1534": "valentino,spring 2022 couture", "1535": "valentino,spring 2022 ready to wear", "1536": "valentino,spring 2023 couture", "1537": "valentino,spring 2023 ready to wear", "1538": "valentino,spring 2024 menswear", "1539": "versace by fendi,pre fall 2022", "1540": "versace,fall 1991 ready to wear", "1541": "versace,fall 1992 ready to wear", "1542": "versace,fall 1993 ready to wear", "1543": "versace,fall 1994 ready to wear", "1544": "versace,fall 1995 ready to wear", "1545": "versace,fall 1996 ready to wear", "1546": "versace,fall 1997 ready to wear", "1547": "versace,fall 2000 ready to wear", "1548": "versace,fall 2001 couture", "1549": "versace,fall 2001 ready to wear", "1550": "versace,fall 2002 couture", "1551": "versace,fall 2002 ready to wear", "1552": "versace,fall 2003 couture", "1553": "versace,fall 2003 ready to wear", "1554": "versace,fall 2004 ready to wear", "1555": "versace,fall 2005 menswear", "1556": "versace,fall 2005 ready to wear", "1557": "versace,fall 2006 menswear", "1558": "versace,fall 2006 ready to wear", "1559": "versace,fall 2007 menswear", "1560": "versace,fall 2007 ready to wear", "1561": "versace,fall 2008 menswear", "1562": "versace,fall 2008 ready to wear", "1563": "versace,fall 2009 ready to wear", "1564": "versace,fall 2010 menswear", "1565": "versace,fall 2010 ready to wear", "1566": "versace,fall 2011 menswear", "1567": "versace,fall 2011 ready to wear", "1568": "versace,fall 2012 menswear", "1569": "versace,fall 2012 ready to wear", "1570": "versace,fall 2013 menswear", "1571": "versace,fall 2013 ready to wear", "1572": "versace,fall 2014 menswear", "1573": "versace,fall 2014 ready to wear", "1574": "versace,fall 2015 menswear", "1575": "versace,fall 2015 ready to wear", "1576": "versace,fall 2016 menswear", "1577": "versace,fall 2016 ready to wear", "1578": "versace,fall 2017 menswear", "1579": "versace,fall 2017 ready to wear", "1580": "versace,fall 2018 menswear", "1581": "versace,fall 2018 ready to wear", "1582": "versace,fall 2019 menswear", "1583": "versace,fall 2019 ready to wear", "1584": "versace,fall 2020 menswear", "1585": "versace,fall 2020 ready to wear", "1586": "versace,fall 2021 ready to wear", "1587": "versace,fall 2022 menswear", "1588": "versace,fall 2022 ready to wear", "1589": "versace,fall 2023 ready to wear", "1590": "versace,pre fall 2008", "1591": "versace,pre fall 2009", "1592": "versace,pre fall 2010", 
"1593": "versace,pre fall 2011", "1594": "versace,pre fall 2012", "1595": "versace,pre fall 2013", "1596": "versace,pre fall 2014", "1597": "versace,pre fall 2015", "1598": "versace,pre fall 2016", "1599": "versace,pre fall 2017", "1600": "versace,pre fall 2018", "1601": "versace,pre fall 2019", "1602": "versace,pre fall 2020", "1603": "versace,pre fall 2021", "1604": "versace,pre fall 2022", "1605": "versace,pre fall 2022 menswear", "1606": "versace,pre fall 2023", "1607": "versace,resort 2008", "1608": "versace,resort 2009", "1609": "versace,resort 2010", "1610": "versace,resort 2011", "1611": "versace,resort 2012", "1612": "versace,resort 2013", "1613": "versace,resort 2014", "1614": "versace,resort 2015", "1615": "versace,resort 2016", "1616": "versace,resort 2017", "1617": "versace,resort 2018", "1618": "versace,resort 2019", "1619": "versace,resort 2020", "1620": "versace,resort 2021", "1621": "versace,resort 2022", "1622": "versace,resort 2023", "1623": "versace,spring 1991 ready to wear", "1624": "versace,spring 1992 ready to wear", "1625": "versace,spring 1993 ready to wear", "1626": "versace,spring 1994 ready to wear", "1627": "versace,spring 1995 ready to wear", "1628": "versace,spring 1996 ready to wear", "1629": "versace,spring 1997 ready to wear", "1630": "versace,spring 2000 ready to wear", "1631": "versace,spring 2001 couture", "1632": "versace,spring 2001 ready to wear", "1633": "versace,spring 2002 couture", "1634": "versace,spring 2002 ready to wear", "1635": "versace,spring 2003 couture", "1636": "versace,spring 2003 ready to wear", "1637": "versace,spring 2004 couture", "1638": "versace,spring 2004 ready to wear", "1639": "versace,spring 2005 menswear", "1640": "versace,spring 2005 ready to wear", "1641": "versace,spring 2006 menswear", "1642": "versace,spring 2006 ready to wear", "1643": "versace,spring 2007 menswear", "1644": "versace,spring 2007 ready to wear", "1645": "versace,spring 2008 couture", "1646": "versace,spring 2008 menswear", "1647": "versace,spring 2008 ready to wear", "1648": "versace,spring 2009 menswear", "1649": "versace,spring 2009 ready to wear", "1650": "versace,spring 2010 ready to wear", "1651": "versace,spring 2011 menswear", "1652": "versace,spring 2011 ready to wear", "1653": "versace,spring 2012 menswear", "1654": "versace,spring 2012 ready to wear", "1655": "versace,spring 2013 menswear", "1656": "versace,spring 2013 ready to wear", "1657": "versace,spring 2014 menswear", "1658": "versace,spring 2014 ready to wear", "1659": "versace,spring 2015 menswear", "1660": "versace,spring 2015 ready to wear", "1661": "versace,spring 2016 menswear", "1662": "versace,spring 2016 ready to wear", "1663": "versace,spring 2017 menswear", "1664": "versace,spring 2017 ready to wear", "1665": "versace,spring 2018 menswear", "1666": "versace,spring 2018 ready to wear", "1667": "versace,spring 2019 menswear", "1668": "versace,spring 2019 ready to wear", "1669": "versace,spring 2020 menswear", "1670": "versace,spring 2020 ready to wear", "1671": "versace,spring 2021 menswear", "1672": "versace,spring 2021 ready to wear", "1673": "versace,spring 2022 ready to wear", "1674": "versace,spring 2023 menswear", "1675": "versace,spring 2023 ready to wear", "1676": "versace,spring 2024 ready to wear"}}}}, {"name": "embeddings", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 1544984763.625, "num_examples": 87547}], "download_size": 1544250543, "dataset_size": 1544984763.625}, "configs": [{"config_name": "default", "data_files": [{"split": "train", 
"path": "data/train-*"}]}]} | 2024-01-29T16:57:26+00:00 | [] | [] | TAGS
#region-us
| # vogue-runway-top15-512px-nobg-embeddings
Vogue Runway
- 15 fashion houses
- 1679 collections
- 87,547 images
Fashion Houses: Alexander McQueen, Armani, Balenciaga, Calvin Klein, Chanel, Dior, Fendi, Gucci, Hermes, Louis Vuitton, Prada, Ralph Lauren, Saint Laurent, Valentino, Versace.
Images are maximum height 512 pixels.
Background is removed using mattmdjaga/segformer_b2_clothes.
Embeddings generated with tonyassi/vogue-fashion-collection-15-nobg. | [
"# vogue-runway-top15-512px-nobg-embeddings\n\n Vogue Runway\n- 15 fashion houses\n- 1679 collections\n- 87,547 images\n\nFashion Houses: Alexander McQueen, Armani, Balenciaga, Calvin Klein, Chanel, Dior, Fendi, Gucci, Hermes, Louis Vuitton, Prada, Ralph Lauren, Saint Laurent, Valentino, Versace.\n\nImages are maximum height 512 pixels.\n\nBackground is removed using mattmdjaga/segformer_b2_clothes.\n\nEmbeddings generated with tonyassi/vogue-fashion-collection-15-nobg."
] | [
"TAGS\n#region-us \n",
"# vogue-runway-top15-512px-nobg-embeddings\n\n Vogue Runway\n- 15 fashion houses\n- 1679 collections\n- 87,547 images\n\nFashion Houses: Alexander McQueen, Armani, Balenciaga, Calvin Klein, Chanel, Dior, Fendi, Gucci, Hermes, Louis Vuitton, Prada, Ralph Lauren, Saint Laurent, Valentino, Versace.\n\nImages are maximum height 512 pixels.\n\nBackground is removed using mattmdjaga/segformer_b2_clothes.\n\nEmbeddings generated with tonyassi/vogue-fashion-collection-15-nobg."
] |
033413a757bd11e00748c5882827f19543261634 | # Dataset Card for "llamaindex_stack"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mhammadkhan/llamaindex_stack | [
"region:us"
] | 2024-01-23T23:00:24+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "metadata", "struct": [{"name": "file_path", "dtype": "string"}, {"name": "repo_id", "dtype": "string"}, {"name": "token_count", "dtype": "int64"}]}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 18683868, "num_examples": 4163}], "download_size": 0, "dataset_size": 18683868}} | 2024-01-23T23:01:41+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "llamaindex_stack"
More Information needed | [
"# Dataset Card for \"llamaindex_stack\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"llamaindex_stack\"\n\nMore Information needed"
] |
246b3e29ed9eb760990f172cc60cb4e19bc4792f | # lilac/TruthfulQA-MultipleChoice
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/truthful_qa](https://huggingface.co/datasets/truthful_qa)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-TruthfulQA-MultipleChoice
```
or from python with:
```py
ll.download("lilacai/lilac-TruthfulQA-MultipleChoice")
```
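
If you only need the underlying data rather than the Lilac-processed version, the original dataset can also be loaded directly with the `datasets` library (a sketch; `multiple_choice` is one of the configs of the upstream `truthful_qa` dataset):

```py
from datasets import load_dataset

# The unprocessed source of this Lilac dataset; truthful_qa ships a single "validation" split.
data = load_dataset("truthful_qa", "multiple_choice", split="validation")
```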
| lilacai/lilac-TruthfulQA-MultipleChoice | [
"Lilac",
"region:us"
] | 2024-01-23T23:00:35+00:00 | {"tags": ["Lilac"]} | 2024-01-23T23:00:37+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/TruthfulQA-MultipleChoice
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/TruthfulQA-MultipleChoice\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/TruthfulQA-MultipleChoice\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
80e602b34b2c9b938cd10697822f7aca84db8a05 | # lilac/GSM8K-main
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/gsm8k](https://huggingface.co/datasets/gsm8k)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-GSM8K-main
```
or from python with:
```py
ll.download("lilacai/lilac-GSM8K-main")
```
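
If you only need the underlying data rather than the Lilac-processed version, the original dataset can also be loaded directly with the `datasets` library (a sketch; `main` is the upstream `gsm8k` config this dataset is named after):

```py
from datasets import load_dataset

# The unprocessed source of this Lilac dataset.
data = load_dataset("gsm8k", "main", split="train")
```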
| lilacai/lilac-GSM8K-main | [
"Lilac",
"region:us"
] | 2024-01-23T23:10:54+00:00 | {"tags": ["Lilac"]} | 2024-01-23T23:10:56+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/GSM8K-main
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/GSM8K-main\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/GSM8K-main\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
cb80eea96da61d90d7345e231fdb605dde234dbd |
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.14
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.14](https://huggingface.co/andysalerno/openchat-nectar-0.14) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14",
"harness_winogrande_5",
split="train")
```
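
Each of the 63 configurations corresponds to one task; to enumerate them before picking one, you can use the standard `datasets` helper below (a minimal sketch; the helper is part of the `datasets` library, not specific to this dataset):

```python
from datasets import get_dataset_config_names

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14")
print(len(configs), configs[:5])
```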
## Latest results
These are the [latest results from run 2024-01-23T23:09:38.113022](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14/blob/main/results_2024-01-23T23-09-38.113022.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6488818534853035,
"acc_stderr": 0.03208696726323281,
"acc_norm": 0.6491291937563749,
"acc_norm_stderr": 0.03275329412199132,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.500920468272256,
"mc2_stderr": 0.015353134413860351
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156218
},
"harness|hellaswag|10": {
"acc": 0.6361282613025294,
"acc_stderr": 0.004801290954387085,
"acc_norm": 0.8302131049591714,
"acc_norm_stderr": 0.0037467817125096527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977109,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977109
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638629
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276875,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503228,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503228
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406978,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406978
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20670391061452514,
"acc_stderr": 0.013543260867834455,
"acc_norm": 0.20670391061452514,
"acc_norm_stderr": 0.013543260867834455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4817470664928292,
"acc_stderr": 0.012761723960595472,
"acc_norm": 0.4817470664928292,
"acc_norm_stderr": 0.012761723960595472
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.027365861131513812,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.027365861131513812
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.500920468272256,
"mc2_stderr": 0.015353134413860351
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923644
}
}
```
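
As a quick sanity check, the headline averages can be recomputed from the per-task numbers above. A minimal sketch, assuming the dictionary printed above has been saved locally as `results.json` (the filename is an assumption):

```python
import json

# Assumed local copy of the results dictionary shown above.
with open("results.json") as f:
    results = json.load(f)

# Average acc_norm over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU average acc_norm: {sum(mmlu) / len(mmlu):.4f}")
```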
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14 | [
"region:us"
] | 2024-01-23T23:11:56+00:00 | {"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.14", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.14](https://huggingface.co/andysalerno/openchat-nectar-0.14) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T23:09:38.113022](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14/blob/main/results_2024-01-23T23-09-38.113022.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6488818534853035,\n \"acc_stderr\": 0.03208696726323281,\n \"acc_norm\": 0.6491291937563749,\n \"acc_norm_stderr\": 0.03275329412199132,\n \"mc1\": 0.3317013463892289,\n \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.500920468272256,\n \"mc2_stderr\": 0.015353134413860351\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156218\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6361282613025294,\n \"acc_stderr\": 0.004801290954387085,\n \"acc_norm\": 0.8302131049591714,\n \"acc_norm_stderr\": 0.0037467817125096527\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977109,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977109\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638629,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276875,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276875\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503228,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503228\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406978,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406978\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20670391061452514,\n \"acc_stderr\": 0.013543260867834455,\n \"acc_norm\": 0.20670391061452514,\n \"acc_norm_stderr\": 0.013543260867834455\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4817470664928292,\n \"acc_stderr\": 0.012761723960595472,\n \"acc_norm\": 0.4817470664928292,\n \"acc_norm_stderr\": 0.012761723960595472\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.027365861131513812,\n \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.027365861131513812\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.500920468272256,\n \"mc2_stderr\": 0.015353134413860351\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \"acc_stderr\": 
0.012714401009923644\n }\n}\n```", "repo_url": "https://huggingface.co/andysalerno/openchat-nectar-0.14", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T23_09_38.113022", "path": ["**/details_harness|winogrande|5_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T23-09-38.113022.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T23_09_38.113022", "path": ["results_2024-01-23T23-09-38.113022.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T23-09-38.113022.parquet"]}]}]} | 2024-01-23T23:12:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.14
Dataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.14 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
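A minimal sketch of the usual loading pattern (the details repository name is assumed to follow the leaderboard's `details_<org>__<model>` convention; `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details of the WinoGrande eval for this run; any other
# config name from the card's metadata (e.g. "harness_gsm8k_5") works the same way.
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14",
                    "harness_winogrande_5",
                    split="train")
```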
## Latest results
These are the latest results from run 2024-01-23T23:09:38.113022 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.14\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.14 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T23:09:38.113022 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.14\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/openchat-nectar-0.14 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T23:09:38.113022 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
02145d02c366a52a4e35f40ba95129edcb7d31b8 | # lilac/GSM8K-socratic
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/gsm8k](https://huggingface.co/datasets/gsm8k)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-GSM8K-socratic
```
or from python with:
```py
import lilac as ll  # requires: pip install lilac
ll.download("lilacai/lilac-GSM8K-socratic")
```
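Once downloaded, the dataset can be browsed locally; a brief sketch, assuming the default Lilac project directory (`start_server` is part of Lilac's Python API, though its exact arguments may differ between versions):

```py
import lilac as ll

ll.download("lilacai/lilac-GSM8K-socratic")
ll.start_server()  # serves the local Lilac UI for browsing the downloaded dataset
```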
| lilacai/lilac-GSM8K-socratic | [
"Lilac",
"region:us"
] | 2024-01-23T23:19:35+00:00 | {"tags": ["Lilac"]} | 2024-01-23T23:22:52+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/GSM8K-socratic
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/GSM8K-socratic\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/GSM8K-socratic\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
8bb0f407d1740a194c6c7659bd0a99dd0e771501 | # lilac/WinoGrande
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/winogrande](https://huggingface.co/datasets/winogrande)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-WinoGrande
```
or from python with:
```py
import lilac as ll  # requires: pip install lilac
ll.download("lilacai/lilac-WinoGrande")
```
| lilacai/lilac-WinoGrande | [
"Lilac",
"region:us"
] | 2024-01-23T23:28:34+00:00 | {"tags": ["Lilac"]} | 2024-01-23T23:28:38+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/WinoGrande
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/WinoGrande\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/WinoGrande\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
d7c943c1df91f401ff828962b93e4687c0ca353e |
# Dataset Card for Evaluation run of BarryFutureman/WildMarcoroni-Variant1-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/WildMarcoroni-Variant1-7B](https://huggingface.co/BarryFutureman/WildMarcoroni-Variant1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-23T23:37:24.789593](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B/blob/main/results_2024-01-23T23-37-24.789593.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6544725411330661,
"acc_stderr": 0.03209534606277316,
"acc_norm": 0.6537231161998335,
"acc_norm_stderr": 0.032767252968853494,
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.6975837745369705,
"mc2_stderr": 0.015108261944159049
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520769,
"acc_norm": 0.7397610921501706,
"acc_norm_stderr": 0.012821930225112571
},
"harness|hellaswag|10": {
"acc": 0.7233618801035651,
"acc_stderr": 0.004464217420693355,
"acc_norm": 0.8860784704242183,
"acc_norm_stderr": 0.0031706661225176552
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083131,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.6975837745369705,
"mc2_stderr": 0.015108261944159049
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624179
}
}
```
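To work with these aggregated numbers programmatically, the "results" configuration can be loaded directly; a short sketch assuming the `latest` split that these evaluation datasets define alongside the timestamped one:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B",
                       "results",
                       split="latest")
```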
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B | [
"region:us"
] | 2024-01-23T23:39:44+00:00 | {"pretty_name": "Evaluation run of BarryFutureman/WildMarcoroni-Variant1-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarryFutureman/WildMarcoroni-Variant1-7B](https://huggingface.co/BarryFutureman/WildMarcoroni-Variant1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T23:37:24.789593](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B/blob/main/results_2024-01-23T23-37-24.789593.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6544725411330661,\n \"acc_stderr\": 0.03209534606277316,\n \"acc_norm\": 0.6537231161998335,\n \"acc_norm_stderr\": 0.032767252968853494,\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.6975837745369705,\n \"mc2_stderr\": 0.015108261944159049\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520769,\n \"acc_norm\": 0.7397610921501706,\n \"acc_norm_stderr\": 0.012821930225112571\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7233618801035651,\n \"acc_stderr\": 0.004464217420693355,\n \"acc_norm\": 0.8860784704242183,\n \"acc_norm_stderr\": 0.0031706661225176552\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 
0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 
0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n 
},\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083131,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083131\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.6975837745369705,\n \"mc2_stderr\": 0.015108261944159049\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7028051554207733,\n \"acc_stderr\": 0.012588685966624179\n }\n}\n```", "repo_url": "https://huggingface.co/BarryFutureman/WildMarcoroni-Variant1-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-37-24.789593.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-37-24.789593.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-37-24.789593.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-37-24.789593.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-37-24.789593.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["**/details_harness|winogrande|5_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-23T23-37-24.789593.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T23_37_24.789593", "path": ["results_2024-01-23T23-37-24.789593.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T23-37-24.789593.parquet"]}]}]} | 2024-01-23T23:40:06+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarryFutureman/WildMarcoroni-Variant1-7B
Dataset automatically created during the evaluation run of model BarryFutureman/WildMarcoroni-Variant1-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
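(The snippet below mirrors the one given in this card's summary; `harness_winogrande_5` is one of the per-task configurations listed in the dataset metadata.)

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande eval;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B",
                    "harness_winogrande_5",
                    split="train")
```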
## Latest results
These are the latest results from run 2024-01-23T23:37:24.789593 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
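A minimal sketch for pulling the most recent details of a single eval, using the `harness_gsm8k_5` configuration named in this dataset's config list and the "latest" split described above:

```python
from datasets import load_dataset

# "latest" resolves to the most recent timestamped run for this eval.
gsm8k_details = load_dataset("open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant1-7B",
                             "harness_gsm8k_5",
                             split="latest")
```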
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
7ad6ccd9b7458e2dbb34cca4342c1b21395110fa |
# Dataset Card for Evaluation run of BarryFutureman/WildMarcoroni-Variant3-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/WildMarcoroni-Variant3-7B](https://huggingface.co/BarryFutureman/WildMarcoroni-Variant3-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details for one task (here: 5-shot Winogrande);
# the "train" split always points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B",
	"harness_winogrande_5",
	split="train")
```
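The snippet above loads per-sample details for a single task. The aggregated metrics shown below live in the separate "results" configuration. As a minimal sketch, assuming the same `datasets` API and the split names listed in this card's metadata:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split is an alias for the most recent one.
results = load_dataset("open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B",
	"results",
	split="latest")
print(results[0])
```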
## Latest results
These are the [latest results from run 2024-01-23T23:43:24.021355](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B/blob/main/results_2024-01-23T23-43-24.021355.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6499314841990914,
"acc_stderr": 0.03221584802268225,
"acc_norm": 0.6493249535584479,
"acc_norm_stderr": 0.03288922243087539,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963915,
"mc2": 0.7168289853433252,
"mc2_stderr": 0.01501857623277707
},
"harness|arc:challenge|25": {
"acc": 0.7064846416382252,
"acc_stderr": 0.013307250444941118,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.7350129456283608,
"acc_stderr": 0.004404243675486008,
"acc_norm": 0.8895638319059949,
"acc_norm_stderr": 0.003127920738394111
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382186,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135367,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135367
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.016574027219517635,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.016574027219517635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963915,
"mc2": 0.7168289853433252,
"mc2_stderr": 0.01501857623277707
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433535
},
"harness|gsm8k|5": {
"acc": 0.6611068991660348,
"acc_stderr": 0.013037955768562499
}
}
```
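The same numbers are also stored as a raw JSON file in this dataset repository (the file linked above). A hedged sketch for fetching and parsing it directly, assuming the standard `huggingface_hub` download API and the layout of the excerpt above:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B",
    filename="results_2024-01-23T23-43-24.021355.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Top-level keys mirror the excerpt above, e.g. the averaged accuracy.
print(results["all"]["acc"])
```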
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
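Until this section is filled in, the configuration and split layout can be inspected programmatically. A minimal sketch, assuming the standard `datasets` helper functions:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B"

# The 63 per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:3])

# Each config has one timestamped split per run and a "latest" alias.
print(get_dataset_split_names(repo, "results"))
```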
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B | [
"region:us"
] | 2024-01-23T23:45:43+00:00 | {"pretty_name": "Evaluation run of BarryFutureman/WildMarcoroni-Variant3-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarryFutureman/WildMarcoroni-Variant3-7B](https://huggingface.co/BarryFutureman/WildMarcoroni-Variant3-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T23:43:24.021355](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B/blob/main/results_2024-01-23T23-43-24.021355.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499314841990914,\n \"acc_stderr\": 0.03221584802268225,\n \"acc_norm\": 0.6493249535584479,\n \"acc_norm_stderr\": 0.03288922243087539,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963915,\n \"mc2\": 0.7168289853433252,\n \"mc2_stderr\": 0.01501857623277707\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7064846416382252,\n \"acc_stderr\": 0.013307250444941118,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7350129456283608,\n \"acc_stderr\": 0.004404243675486008,\n \"acc_norm\": 0.8895638319059949,\n \"acc_norm_stderr\": 0.003127920738394111\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382186,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382186\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 
0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135367,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135367\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.01366423099583483\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963915,\n \"mc2\": 0.7168289853433252,\n \"mc2_stderr\": 0.01501857623277707\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433535\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6611068991660348,\n \"acc_stderr\": 0.013037955768562499\n }\n}\n```", "repo_url": "https://huggingface.co/BarryFutureman/WildMarcoroni-Variant3-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-43-24.021355.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-43-24.021355.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-43-24.021355.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-43-24.021355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-43-24.021355.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["**/details_harness|winogrande|5_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-23T23-43-24.021355.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_23T23_43_24.021355", "path": ["results_2024-01-23T23-43-24.021355.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T23-43-24.021355.parquet"]}]}]} | 2024-01-23T23:46:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarryFutureman/WildMarcoroni-Variant3-7B
Dataset automatically created during the evaluation run of model BarryFutureman/WildMarcoroni-Variant3-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
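A minimal sketch of that call, reconstructed from the standard leaderboard card pattern (the repository id and the "harness_winogrande_5" config name are inferred from the sibling cards, not stated in this stripped rendering):

```python
from datasets import load_dataset

# Repo id follows the usual "details_{org}__{model}" pattern (assumed here).
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__WildMarcoroni-Variant3-7B",
	"harness_winogrande_5",
	split="train")
```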
## Latest results
These are the latest results from run 2024-01-23T23:43:24.021355 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarryFutureman/WildMarcoroni-Variant3-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/WildMarcoroni-Variant3-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T23:43:24.021355(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarryFutureman/WildMarcoroni-Variant3-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/WildMarcoroni-Variant3-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T23:43:24.021355(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
18b5e1636a501dc3f467d51c118a01b7ff737238 |
# Dataset Card for Evaluation run of Weyaxi/Einstein-openchat-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-openchat-7B](https://huggingface.co/Weyaxi/Einstein-openchat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B",
"harness_winogrande_5",
split="train")
```
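If you are unsure which configuration names exist for this dataset, the `datasets` library can enumerate them; a small sketch:

```python
from datasets import get_dataset_config_names

# Lists all 63 configurations (per-task details plus the aggregated "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B")
print(configs[:5])
```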
## Latest results
These are the [latest results from run 2024-01-23T23:44:02.231759](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B/blob/main/results_2024-01-23T23-44-02.231759.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6416289994817496,
"acc_stderr": 0.03222660852943186,
"acc_norm": 0.6436221960309779,
"acc_norm_stderr": 0.03287361714145995,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5450984783622247,
"mc2_stderr": 0.0153785526545658
},
"harness|arc:challenge|25": {
"acc": 0.606655290102389,
"acc_stderr": 0.014275101465693028,
"acc_norm": 0.6510238907849829,
"acc_norm_stderr": 0.0139289334613825
},
"harness|hellaswag|10": {
"acc": 0.6452897829117705,
"acc_stderr": 0.004774476498238618,
"acc_norm": 0.8356901015733917,
"acc_norm_stderr": 0.0036979923561244747
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.02479011845933221,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.02479011845933221
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.02428314052946731,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.02428314052946731
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083018,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078955,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078955
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973148,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973148
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208195,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208195
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863935,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863935
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144717,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144717
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687492,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687492
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712845,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5450984783622247,
"mc2_stderr": 0.0153785526545658
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987743
},
"harness|gsm8k|5": {
"acc": 0.6087945413191812,
"acc_stderr": 0.013442502402794302
}
}
```
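To work with these aggregated numbers programmatically rather than copying the JSON above, you can load the "results" configuration; a minimal sketch, assuming the "latest" split naming used by the per-task configurations in this dataset's metadata:

```python
from datasets import load_dataset

# The "results" configuration stores one row per evaluation run.
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B",
	"results",
	split="latest")
print(results[0])  # aggregated metrics, mirroring the JSON above
```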
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B | [
"region:us"
] | 2024-01-23T23:46:21+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Einstein-openchat-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Einstein-openchat-7B](https://huggingface.co/Weyaxi/Einstein-openchat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-23T23:44:02.231759](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B/blob/main/results_2024-01-23T23-44-02.231759.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6416289994817496,\n \"acc_stderr\": 0.03222660852943186,\n \"acc_norm\": 0.6436221960309779,\n \"acc_norm_stderr\": 0.03287361714145995,\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5450984783622247,\n \"mc2_stderr\": 0.0153785526545658\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.606655290102389,\n \"acc_stderr\": 0.014275101465693028,\n \"acc_norm\": 0.6510238907849829,\n \"acc_norm_stderr\": 0.0139289334613825\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6452897829117705,\n \"acc_stderr\": 0.004774476498238618,\n \"acc_norm\": 0.8356901015733917,\n \"acc_norm_stderr\": 0.0036979923561244747\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.02479011845933221,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.02479011845933221\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.02428314052946731,\n \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.02428314052946731\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078955,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078955\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8173690932311622,\n \"acc_stderr\": 0.013816335389973148,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973148\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n \"acc_stderr\": 0.015251931579208195,\n \"acc_norm\": 0.29497206703910617,\n \"acc_norm_stderr\": 0.015251931579208195\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n \"acc_stderr\": 0.012755368722863935,\n \"acc_norm\": 0.4758800521512386,\n \"acc_norm_stderr\": 0.012755368722863935\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712845,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712845\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5450984783622247,\n \"mc2_stderr\": 0.0153785526545658\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987743\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6087945413191812,\n \"acc_stderr\": 0.013442502402794302\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Einstein-openchat-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-44-02.231759.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-44-02.231759.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-44-02.231759.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-23T23-44-02.231759.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-44-02.231759.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_23T23_44_02.231759", "path": ["**/details_harness|winogrande|5_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-23T23-44-02.231759.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_23T23_44_02.231759", "path": ["results_2024-01-23T23-44-02.231759.parquet"]}, {"split": "latest", "path": ["results_2024-01-23T23-44-02.231759.parquet"]}]}]} | 2024-01-23T23:46:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Einstein-openchat-7B
Dataset automatically created during the evaluation run of model Weyaxi/Einstein-openchat-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
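For example (a sketch following the convention of the other leaderboard detail cards; the repository name is inferred from the model name and should be verified):

```python
from datasets import load_dataset

# Repository name inferred from the model name ("/" replaced by "__"); adjust if it differs.
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Einstein-openchat-7B",
	"harness_winogrande_5",
	split="train")
```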
## Latest results
These are the latest results from run 2024-01-23T23:44:02.231759 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Einstein-openchat-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Einstein-openchat-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T23:44:02.231759(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Einstein-openchat-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Einstein-openchat-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-23T23:44:02.231759(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c6a52ce035d09052616cad6ed014b9696bf454c8 |
# Android GPU Performance Counter to Key Press Dataset
## Description
This dataset comes from our mobile GPU-based eavesdropping work, [Eavesdropping user credentials via GPU side channels on smartphones](https://doi.org/10.1145/3503222.3507757), presented at the 27th ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2022).
It contains 3,466 traces mapping on-screen keyboard key presses to the corresponding Snapdragon Adreno GPU performance counter changes collected on-device at the same time.
## Data Structure
The dataset is arranged in the following format:
* Folder name (e.g., `1622457056`): The UNIX timestamp of when the experiment took place.
* `timestamp_data.csv`: Raw recording of GPU performance counter changes during the experiment.
    * Column 1: UNIX timestamp of each performance counter ("PC") value change event, with a granularity of 1 microsecond.
    * Columns 2-13: GPU PC value changes of different types:
* `PERF_LRZ_VISIBLE_PRIM_AFTER_LRZ`
* `PERF_LRZ_FULL_8X8_TILES`
* `PERF_LRZ_PARTIAL_8X8_TILES`
* `PERF_LRZ_VISIBLE_PIXEL_AFTER_LRZ`
* `PERF_RAS_SUPERTILE_ACTIVE_CYCLES`
* `PERF_RAS_SUPER_TILES`
* `PERF_RAS_8X4_TILES`
* `PERF_RAS_FULLY_COVERED_8X4_TILES`
* `PERF_VPC_PC_PRIMITIVES`
* `PERF_VPC_SP_COMPONENTS`
* `PERF_VPC_LRZ_ASSIGN_PRIMITIVES`
* `PERF_VPC_SP_LM_COMPONENTS`
  * `timestamp_keys.csv`: Keyboard key presses that occurred during the experiment.
    * Column 1: UNIX timestamp of each key press, with a granularity of 1 microsecond.
    * Column 2: The specific key that was pressed.
For a detailed discussion of the meanings of the different GPU PCs, please refer to Section 4 of [our paper](https://doi.org/10.1145/3503222.3507757).
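As a minimal loading sketch (assuming the CSV files carry no header row; the folder name, pandas, and the `merge_asof` alignment are illustrative choices, not part of the dataset):

```python
import pandas as pd

# Example trace folder; folder names are UNIX timestamps such as 1622457056.
trace = "1622457056"

# Columns 2-13 of timestamp_data.csv, in the order listed above.
pc_columns = [
    "PERF_LRZ_VISIBLE_PRIM_AFTER_LRZ", "PERF_LRZ_FULL_8X8_TILES",
    "PERF_LRZ_PARTIAL_8X8_TILES", "PERF_LRZ_VISIBLE_PIXEL_AFTER_LRZ",
    "PERF_RAS_SUPERTILE_ACTIVE_CYCLES", "PERF_RAS_SUPER_TILES",
    "PERF_RAS_8X4_TILES", "PERF_RAS_FULLY_COVERED_8X4_TILES",
    "PERF_VPC_PC_PRIMITIVES", "PERF_VPC_SP_COMPONENTS",
    "PERF_VPC_LRZ_ASSIGN_PRIMITIVES", "PERF_VPC_SP_LM_COMPONENTS",
]

# Column 1 is the event timestamp (microsecond granularity).
pc_df = pd.read_csv(f"{trace}/timestamp_data.csv", header=None,
                    names=["timestamp"] + pc_columns)

# Column 1 is the key-press timestamp, column 2 the key pressed.
keys_df = pd.read_csv(f"{trace}/timestamp_keys.csv", header=None,
                      names=["timestamp", "key"])

# Pair each key press with the nearest preceding PC sample (one possible alignment).
aligned = pd.merge_asof(keys_df.sort_values("timestamp"),
                        pc_df.sort_values("timestamp"),
                        on="timestamp", direction="backward")
```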
## Citation
If you find this dataset useful, please consider citing the original published paper as shown below:
```
@inproceedings{yang2022eavesdropping,
author = {Yang, Boyuan and Chen, Ruirong and Huang, Kai and Yang, Jun and Gao, Wei},
title = {Eavesdropping user credentials via GPU side channels on smartphones},
year = {2022},
isbn = {9781450392051},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3503222.3507757},
doi = {10.1145/3503222.3507757},
booktitle = {Proceedings of the 27th ACM International Conference on Architectural Support for Programming Languages and Operating Systems},
pages = {285–299},
numpages = {15},
keywords = {Smartphones, Side Channel, Performance Counters, Mobile GPU, Input Eavesdropping},
location = {Lausanne, Switzerland},
series = {ASPLOS '22}
}
```
## License
[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg | hosiet/android-perfcounter-to-key-press | [
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-01-23T23:55:46+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["1K<n<10K"], "pretty_name": "Android GPU Performance Counter to Key Press Dataset"} | 2024-01-29T15:06:14+00:00 | [] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #region-us
|
# Android GPU Performance Counter to Key Press Dataset
## Description
This dataset comes from our mobile GPU-based eavesdropping work, Eavesdropping user credentials via GPU side channels on smartphones, presented at the 27th ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2022).
It contains 3,466 traces mapping on-screen keyboard key presses to the corresponding Snapdragon Adreno GPU performance counter changes collected on-device at the same time.
## Data Structure
The dataset is arranged in the following format:
* Folder name (e.g., '1622457056'): The UNIX timestamp of when the experiment took place.
* 'timestamp_data.csv': Raw recording of GPU performance counter changes during the experiment.
    * Column 1: UNIX timestamp of each performance counter ("PC") value change event, with a granularity of 1 microsecond.
    * Columns 2-13: GPU PC value changes of different types:
* 'PERF_LRZ_VISIBLE_PRIM_AFTER_LRZ'
* 'PERF_LRZ_FULL_8X8_TILES'
* 'PERF_LRZ_PARTIAL_8X8_TILES'
* 'PERF_LRZ_VISIBLE_PIXEL_AFTER_LRZ'
* 'PERF_RAS_SUPERTILE_ACTIVE_CYCLES'
* 'PERF_RAS_SUPER_TILES'
* 'PERF_RAS_8X4_TILES'
* 'PERF_RAS_FULLY_COVERED_8X4_TILES'
* 'PERF_VPC_PC_PRIMITIVES'
* 'PERF_VPC_SP_COMPONENTS'
* 'PERF_VPC_LRZ_ASSIGN_PRIMITIVES'
* 'PERF_VPC_SP_LM_COMPONENTS'
  * 'timestamp_keys.csv': Keyboard key presses that occurred during the experiment.
    * Column 1: UNIX timestamp of each key press, with a granularity of 1 microsecond.
    * Column 2: The specific key that was pressed.
For a detailed discussion of the meanings of the different GPU PCs, please refer to Section 4 of our paper.
If you find this dataset useful, please consider citing the original published paper.
## License
[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: URL
[cc-by-nc-sa-image]: URL
[cc-by-nc-sa-shield]: URL | [
"# Android GPU Performance Counter to Key Press Dataset",
"## Description\n\nThis dataset comes from our mobile GPU-based eavesdropping work, Eavesdropping user credentials via GPU side channels on smartphones, presented at the 27th ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2022).\nIt contains 3,466 traces of mapping between the on-screen keyboard key presses and corresponding Snapdragon Adreno GPU performance counter changes collected on device in the meantime.",
"## Data Structure\n\nThe dataset is arranged in the following format:\n\n* Folder name (e.g., '1622457056'): This UNIX timestamp when the experiment took place.\n * 'timestamp_data.csv': Raw recording of GPU performance counter changes during the experiment.\n * Column 1: UNIX timestamp of each performance counter (\"PC\") value change event, with granularity of 1 microseconds.\n * Column 2-13: GPU PC value changes of different types:\n * 'PERF_LRZ_VISIBLE_PRIM_AFTER_LRZ'\n * 'PERF_LRZ_FULL_8X8_TILES'\n * 'PERF_LRZ_PARTIAL_8X8_TILES'\n * 'PERF_LRZ_VISIBLE_PIXEL_AFTER_LRZ'\n * 'PERF_RAS_SUPERTILE_ACTIVE_CYCLES'\n * 'PERF_RAS_SUPER_TILES'\n * 'PERF_RAS_8X4_TILES'\n * 'PERF_RAS_FULLY_COVERED_8X4_TILES'\n * 'PERF_VPC_PC_PRIMITIVES'\n * 'PERF_VPC_SP_COMPONENTS'\n * 'PERF_VPC_LRZ_ASSIGN_PRIMITIVES'\n * 'PERF_VPC_SP_LM_COMPONENTS'\n * 'timestamp_keys.csv': Keyboard key presses occurred during the experiment.\n * Column 1: UNIX timestamp of each key press, with granularity of 1 microseconds.\n * Column 2: The specific key press occurred.\n\nFor the discussion of detailed meanings of different GPU PCs, please refer to Section 4 of our paper.\n \nIf you find this dataset useful, please consider citing the original published paper as shown below:",
"## License\n\n[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]\n\nThis work is licensed under a\n[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].\n\n[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]\n\n[cc-by-nc-sa]: URL\n[cc-by-nc-sa-image]: URL\n[cc-by-nc-sa-shield]: URL"
] | [
"TAGS\n#size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #region-us \n",
"# Android GPU Performance Counter to Key Press Dataset",
"## Description\n\nThis dataset comes from our mobile GPU-based eavesdropping work, Eavesdropping user credentials via GPU side channels on smartphones, presented at the 27th ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2022).\nIt contains 3,466 traces of mapping between the on-screen keyboard key presses and corresponding Snapdragon Adreno GPU performance counter changes collected on device in the meantime.",
"## Data Structure\n\nThe dataset is arranged in the following format:\n\n* Folder name (e.g., '1622457056'): This UNIX timestamp when the experiment took place.\n * 'timestamp_data.csv': Raw recording of GPU performance counter changes during the experiment.\n * Column 1: UNIX timestamp of each performance counter (\"PC\") value change event, with granularity of 1 microseconds.\n * Column 2-13: GPU PC value changes of different types:\n * 'PERF_LRZ_VISIBLE_PRIM_AFTER_LRZ'\n * 'PERF_LRZ_FULL_8X8_TILES'\n * 'PERF_LRZ_PARTIAL_8X8_TILES'\n * 'PERF_LRZ_VISIBLE_PIXEL_AFTER_LRZ'\n * 'PERF_RAS_SUPERTILE_ACTIVE_CYCLES'\n * 'PERF_RAS_SUPER_TILES'\n * 'PERF_RAS_8X4_TILES'\n * 'PERF_RAS_FULLY_COVERED_8X4_TILES'\n * 'PERF_VPC_PC_PRIMITIVES'\n * 'PERF_VPC_SP_COMPONENTS'\n * 'PERF_VPC_LRZ_ASSIGN_PRIMITIVES'\n * 'PERF_VPC_SP_LM_COMPONENTS'\n * 'timestamp_keys.csv': Keyboard key presses occurred during the experiment.\n * Column 1: UNIX timestamp of each key press, with granularity of 1 microseconds.\n * Column 2: The specific key press occurred.\n\nFor the discussion of detailed meanings of different GPU PCs, please refer to Section 4 of our paper.\n \nIf you find this dataset useful, please consider citing the original published paper as shown below:",
"## License\n\n[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]\n\nThis work is licensed under a\n[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].\n\n[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]\n\n[cc-by-nc-sa]: URL\n[cc-by-nc-sa-image]: URL\n[cc-by-nc-sa-shield]: URL"
] |
b6d5ff29fcf84feb9606e41a22b83543ab07dbef |
# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/MBX-7B-v3](https://huggingface.co/flemmingmiguel/MBX-7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3",
"harness_winogrande_5",
split="train")
```
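Because this repository aggregates four runs, each configuration also exposes one split per run timestamp; for instance, to pin the 2024-01-29 run instead of the latest one (split and config names as listed in this card's file metadata):

```python
run_data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3",
	"harness_gsm8k_5",
	split="2024_01_29T00_10_52.670556")
```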
## Latest results
These are the [latest results from run 2024-01-29T00:10:52.670556](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3/blob/main/results_2024-01-29T00-10-52.670556.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6568226997496719,
"acc_stderr": 0.03194203481467334,
"acc_norm": 0.6561689082057574,
"acc_norm_stderr": 0.03261255172047086,
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7186768399576933,
"mc2_stderr": 0.014757394057634371
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.01328452529240351,
"acc_norm": 0.7414675767918089,
"acc_norm_stderr": 0.012794553754288694
},
"harness|hellaswag|10": {
"acc": 0.7161919936267676,
"acc_stderr": 0.004499233874427508,
"acc_norm": 0.8890659231228839,
"acc_norm_stderr": 0.003134086549952684
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049942,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049942
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7186768399576933,
"mc2_stderr": 0.014757394057634371
},
"harness|winogrande|5": {
"acc": 0.8555643251775849,
"acc_stderr": 0.009879767358079229
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624184
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3 | [
"region:us"
] | 2024-01-24T00:14:15+00:00 | {"pretty_name": "Evaluation run of flemmingmiguel/MBX-7B-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/MBX-7B-v3](https://huggingface.co/flemmingmiguel/MBX-7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-29T00:10:52.670556](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3/blob/main/results_2024-01-29T00-10-52.670556.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6568226997496719,\n \"acc_stderr\": 0.03194203481467334,\n \"acc_norm\": 0.6561689082057574,\n \"acc_norm_stderr\": 0.03261255172047086,\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7186768399576933,\n \"mc2_stderr\": 0.014757394057634371\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.01328452529240351,\n \"acc_norm\": 0.7414675767918089,\n \"acc_norm_stderr\": 0.012794553754288694\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7161919936267676,\n \"acc_stderr\": 0.004499233874427508,\n \"acc_norm\": 0.8890659231228839,\n \"acc_norm_stderr\": 0.003134086549952684\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.016602564615049942,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.016602564615049942\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7186768399576933,\n \"mc2_stderr\": 0.014757394057634371\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8555643251775849,\n \"acc_stderr\": 0.009879767358079229\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \"acc_stderr\": 0.012588685966624184\n 
}\n}\n```", "repo_url": "https://huggingface.co/flemmingmiguel/MBX-7B-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|arc:challenge|25_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|arc:challenge|25_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|arc:challenge|25_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|arc:challenge|25_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|gsm8k|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|gsm8k|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|gsm8k|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|gsm8k|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hellaswag|10_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hellaswag|10_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hellaswag|10_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hellaswag|10_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T00-11-56.066743.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T00-11-56.066743.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T00-11-56.066743.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T06-10-31.066280.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-25T06-10-31.066280.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-32-50.245516.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-32-50.245516.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-26T02-32-50.245516.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": 
["**/details_harness|hendrycksTest-econometrics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-29T00-10-52.670556.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": 
"2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["**/details_harness|winogrande|5_2024-01-24T00-11-56.066743.parquet"]}, {"split": "2024_01_25T06_10_31.066280", "path": ["**/details_harness|winogrande|5_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["**/details_harness|winogrande|5_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["**/details_harness|winogrande|5_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-29T00-10-52.670556.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_24T00_11_56.066743", "path": ["results_2024-01-24T00-11-56.066743.parquet"]}, 
{"split": "2024_01_25T06_10_31.066280", "path": ["results_2024-01-25T06-10-31.066280.parquet"]}, {"split": "2024_01_26T02_32_50.245516", "path": ["results_2024-01-26T02-32-50.245516.parquet"]}, {"split": "2024_01_29T00_10_52.670556", "path": ["results_2024-01-29T00-10-52.670556.parquet"]}, {"split": "latest", "path": ["results_2024-01-29T00-10-52.670556.parquet"]}]}]} | 2024-01-29T00:13:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B-v3
Dataset automatically created during the evaluation run of model flemmingmiguel/MBX-7B-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
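A minimal sketch, assuming the details repo follows the `details_<org>__<model>` naming convention used by the other evaluation runs in this document:

```python
from datasets import load_dataset

# Repo id inferred from the card title; adjust if the actual details repo differs.
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__MBX-7B-v3",
                    "harness_winogrande_5",
                    split="train")
```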
## Latest results
These are the latest results from run 2024-01-29T00:10:52.670556 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B-v3\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/MBX-7B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-29T00:10:52.670556(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of flemmingmiguel/MBX-7B-v3\n\n\n\nDataset automatically created during the evaluation run of model flemmingmiguel/MBX-7B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-29T00:10:52.670556(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
369702693d0e51d57e247ff5ff26ca94264e9869 | # lilac/SlimOrca
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/Open-Orca/SlimOrca](https://huggingface.co/datasets/Open-Orca/SlimOrca)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-SlimOrca
```
or from python with:
```py
import lilac as ll  # the lilac package is conventionally imported as ll

ll.download("lilacai/lilac-SlimOrca")
```
| lilacai/lilac-SlimOrca | [
"Lilac",
"region:us"
] | 2024-01-24T00:26:54+00:00 | {"tags": ["Lilac"]} | 2024-01-26T14:57:34+00:00 | [] | [] | TAGS
#Lilac #region-us
| # lilac/SlimOrca
This dataset is a Lilac processed dataset. Original dataset: URL
To download the dataset to a local directory:
or from python with:
| [
"# lilac/SlimOrca\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] | [
"TAGS\n#Lilac #region-us \n",
"# lilac/SlimOrca\nThis dataset is a Lilac processed dataset. Original dataset: URL\n\nTo download the dataset to a local directory:\n\n\n\nor from python with:"
] |
e86e5451a1fce4731fe07c710649a229d92e5cab | # FinEntity: A Dataset for entity-level sentiment classification.
In this work, we introduce an entity-level sentiment classification dataset, called **FinEntity**, that annotates sentiment (positive, neutral, and negative) of individual financial entities in financial news. The dataset construction process is well-documented in the paper.
* Paper: [FinEntity: Entity-level Sentiment Classification for Financial Texts](https://aclanthology.org/2023.emnlp-main.956.pdf)
* More Information: [Github](https://github.com/yixuantt/FinEntity)
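For convenience, the dataset can presumably be loaded with the standard `datasets` API (a minimal sketch; the repo id comes from this card, while the split and field layout should be checked against the repo):

```python
from datasets import load_dataset

# Repo id taken from this card; inspect the loaded object for the exact
# splits and field names, since they are not documented here.
ds = load_dataset("yixuantt/FinEntity")
print(ds)
```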
# Citation
```
@inproceedings{tang-etal-2023-finentity,
title = "{F}in{E}ntity: Entity-level Sentiment Classification for Financial Texts",
author = "Tang, Yixuan and
Yang, Yi and
Huang, Allen and
Tam, Andy and
Tang, Justin",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.956",
doi = "10.18653/v1/2023.emnlp-main.956",
pages = "15465--15471",
abstract = "In the financial domain, conducting entity-level sentiment analysis is crucial for accurately assessing the sentiment directed toward a specific financial entity. To our knowledge, no publicly available dataset currently exists for this purpose. In this work, we introduce an entity-level sentiment classification dataset, called FinEntity, that annotates financial entity spans and their sentiment (positive, neutral, and negative) in financial news. We document the dataset construction process in the paper. Additionally, we benchmark several pre-trained models (BERT, FinBERT, etc.) and ChatGPT on entity-level sentiment classification. In a case study, we demonstrate the practical utility of using FinEntity in monitoring cryptocurrency markets. The data and code of FinEntity is available at https://github.com/yixuantt/FinEntity.",
}
``` | yixuantt/FinEntity | [
"task_categories:token-classification",
"size_categories:1K<n<10K",
"language:en",
"license:odc-by",
"finance",
"region:us"
] | 2024-01-24T00:41:55+00:00 | {"language": ["en"], "license": "odc-by", "size_categories": ["1K<n<10K"], "task_categories": ["token-classification"], "pretty_name": "FinEntity", "tags": ["finance"]} | 2024-01-24T00:57:45+00:00 | [] | [
"en"
] | TAGS
#task_categories-token-classification #size_categories-1K<n<10K #language-English #license-odc-by #finance #region-us
| # FinEntity: A Dataset for entity-level sentiment classification.
In this work, we introduce an entity-level sentiment classification dataset, called FinEntity, that annotates sentiment (positive, neutral, and negative) of individual financial entities in financial news. The dataset construction process is well-documented in the paper.
* Paper: FinEntity: Entity-level Sentiment Classification for Financial Texts
* More Information: Github
| [
"# FinEntity: A Dataset for entity-level sentiment classification.\nIn this work, we introduce an entity-level sentiment classification dataset, called FinEntity, that annotates sentiment (positive, neutral, and negative) of individual financial entities in financial news. The dataset construction process is well-documented in the paper. \n* Paper: FinEntity: Entity-level Sentiment Classification for Financial Texts\n* More Information: Github"
] | [
"TAGS\n#task_categories-token-classification #size_categories-1K<n<10K #language-English #license-odc-by #finance #region-us \n",
"# FinEntity: A Dataset for entity-level sentiment classification.\nIn this work, we introduce an entity-level sentiment classification dataset, called FinEntity, that annotates sentiment (positive, neutral, and negative) of individual financial entities in financial news. The dataset construction process is well-documented in the paper. \n* Paper: FinEntity: Entity-level Sentiment Classification for Financial Texts\n* More Information: Github"
] |
03423cf7ddaf2441204f983d29c3f5e0fa812320 |
The [openai_humaneval](https://huggingface.co/datasets/openai_humaneval) dataset, with one-line bugs of various forms introduced into the solutions. These bugs are generated using abstract syntax trees (ASTs) in Python to randomly sample variables, functions, and expressions in the function body and replace them with other variables, functions, and expressions, respectively.
The data contains two splits: ```control``` and ```print```. Code for generating ```humaneval-patch``` is provided [here](https://github.com/janphilippfranken/printllama/tree/main/experiments/humaneval-patch/print-insertions). Developed as part of an investigation of language models' ability to utilize print statements to effectively repair buggy code. A detailed description of the dataset construction process is included below.
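Both splits should load with the standard `datasets` API (a minimal sketch, assuming the split names above map directly to Hugging Face splits):

```python
from datasets import load_dataset

# Split names come from this card; the column layout is not documented here.
control = load_dataset("scandukuri/humaneval-patch", split="control")
printed = load_dataset("scandukuri/humaneval-patch", split="print")
```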
**control**
To generate this dataset, we begin by considering a pair (*i*, *t*) consisting of a solution *i* from [openai_humaneval](https://huggingface.co/datasets/openai_humaneval) - which contains 164 unique problem-solution pairs - and a perturbation type *t* in {```variable```, ```expression```, ```function```}.
1. Construct an abstract syntax tree for solution *i*.
2. Construct a buggy solution *i'* by randomly replacing a node *n* of type *t* with another node *n'* of type *t*. If *i'* both represents a novel incorrect solution and evaluates without error, add buggy solution *i'* to the dataset and move to problem *i + 1*.
3. Attempt step 2 100 times, exiting the loop as soon as a valid buggy solution is reached. If a valid solution is not reached, move on from pair (*i*, *t*).
We repeat for all pairs (*i*, *t*), and end up with 316 novel buggy solutions to OpenAI's humaneval problems. This is the ```control``` split.
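For illustration, here is a minimal sketch of the ```variable``` perturbation in step 2, using Python's built-in `ast` module (the function name `perturb_variable` is hypothetical and the novelty and error checks are omitted; the actual generation code is linked above):

```python
import ast
import random

def perturb_variable(source: str, rng: random.Random) -> str:
    """Swap one variable occurrence for another variable used in the same
    solution. The real pipeline also verifies that the result is a novel,
    error-free solution before accepting it."""
    tree = ast.parse(source)
    names = [node for node in ast.walk(tree) if isinstance(node, ast.Name)]
    identifiers = sorted({node.id for node in names})
    if len(identifiers) < 2:
        return source  # nothing to swap with
    target = rng.choice(names)
    target.id = rng.choice([i for i in identifiers if i != target.id])
    return ast.unparse(tree)  # ast.unparse requires Python 3.9+

buggy = perturb_variable("def add(a, b):\n    result = a + b\n    return result\n",
                         random.Random(0))
print(buggy)
```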
**print**
Now, selecting 30 buggy solutions from the ```control``` split - 10 of each type *t* - we construct "expert" manual print statements that would help a user debug the incorrect solutions.
Considering each buggy solution *j* from the ```control``` split:
1. Sample 3 expert prints from the 30 manual prints above.
2. Prompt GPT-4 to insert a similar print statement for buggy solution *j*.
3. Generate 20 different print insertions *p* per problem.
4. Allow [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) to select, among the 20 insertions for problem *j*, the print insertion *p* that leads to the highest bug repair accuracy (where the repair accuracy for a print *p* of interest is calculated as a percentage over 20 repair attempts by the model).
5. Keep the print insertion *p* associated with the highest repair accuracy for problem *j* (this selection step is sketched below).
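A minimal sketch of that selection step, assuming the repair model is supplied as a callable (`attempt_repair` is a hypothetical stand-in for a single repair attempt; the actual pipeline prompts Mixtral):

```python
import random
from typing import Callable, Sequence

def select_best_print(candidates: Sequence[str],
                      attempt_repair: Callable[[str], bool],
                      n_attempts: int = 20) -> str:
    """Return the candidate print-augmented solution with the highest repair
    accuracy, i.e. the fraction of successful repairs over n_attempts tries."""
    def accuracy(code: str) -> float:
        return sum(attempt_repair(code) for _ in range(n_attempts)) / n_attempts
    return max(candidates, key=accuracy)

# Dummy repair model so the sketch runs end to end; a real harness would
# query mistralai/Mixtral-8x7B-Instruct-v0.1 here.
best = select_best_print(["candidate_a", "candidate_b"],
                         attempt_repair=lambda code: random.Random(code).random() < 0.5)
```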
We end up with the 316 buggy solutions from the ```control``` split, each with a 'mixtral-optimal' print statement inserted - this is the ```print``` split. | scandukuri/humaneval-patch | [
"license:mit",
"region:us"
] | 2024-01-24T01:05:08+00:00 | {"license": "mit"} | 2024-01-24T04:21:25+00:00 | [] | [] | TAGS
#license-mit #region-us
|
openai_humaneval dataset, with one-line bugs of various forms in the solutions. These bugs are generated using abstract syntax trees (ASTs) in Python, to randomly sample variables, functions, and expressions in the function body and replace them with other variables, functions and expressions respectively.
The data contains two splits- and . Code for generating is provided here. Developed as part of an investigation of language models' ability to utilize print statements to effectively repair buggy code. A detailed description of the dataset construction process is included below.
control
To generate this dataset, we begin by considering a pair (*i*, *t*) consisting of a solution *i* from openai_humaneval - which contains 164 unique problem-solution pairs - and a perturbation type *t* in {, , }.
1. Construct an abstract syntax tree for solution *i*.
2. Construct a buggy solution *i'* by randomly replacing a node *n* of type *t* with another node *n'* of type *t*. If *i'* both represents a novel incorrect solution and evaluates without error, add buggy solution *i'* to the dataset and move to problem *i + 1*.
3. Attempt step 2 100 times, exiting the loop as soon as a valid buggy solution is reached. If a valid solution is not reached, move on from pair (*i*, *t*).
We repeat for all pairs (*i*, *t*), and end up with 316 novel buggy solutions to OpenAI's humaneval problems. This is the split.
print
Now, selecting 30 buggy solutions from the split - 10 of each type *t* - we construct "expert" manual print statements that would help a user debug the incorrect solutions.
Considering each buggy solution *j* from the split:
1. Sample 3 expert prints from the 30 manual prints above.
2. Prompt GPT-4 to insert a similar print statement for buggy solution *j*.
3. Generate 20 different print insertions *p* per problem
4. Allow mistralai/Mixtral-8x7B-Instruct-v0.1 to select the print insertion *p* among 20 insertions for problem *j* which leads to the highest bug repair accuracy (where the repair accuracy for a print *p* of interest is calculated as a percentage over 20 repair attempts by the model)
5. Keep the print insertion *p* associated with the highest repair accuracy for problem *j*.
We end up with the 316 buggy solutions from the split, each with a 'mixtral-optimal' print statement inserted - this is the split. | [] | [
"TAGS\n#license-mit #region-us \n"
] |
891de8f884ebec54b0fe987366c70df0f659858e |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Hsemih/TrainingPropKNN | [
"region:us"
] | 2024-01-24T01:22:44+00:00 | {} | 2024-01-24T01:30:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b29df2da3a7234bd46e6c84af22d84988f9dd617 |
# Dataset Card for Evaluation run of Aryanne/ereb-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aryanne/ereb-test](https://huggingface.co/Aryanne/ereb-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aryanne__ereb-test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-24T01:34:47.525846](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__ereb-test/blob/main/results_2024-01-24T01-34-47.525846.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2877610576763335,
"acc_stderr": 0.03178333120937811,
"acc_norm": 0.28998601852554035,
"acc_norm_stderr": 0.03262751184295166,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688246,
"mc2": 0.47395581482094956,
"mc2_stderr": 0.0155095731496728
},
"harness|arc:challenge|25": {
"acc": 0.378839590443686,
"acc_stderr": 0.014175915490000322,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009128
},
"harness|hellaswag|10": {
"acc": 0.5435172276438957,
"acc_stderr": 0.00497084669755231,
"acc_norm": 0.7104162517426807,
"acc_norm_stderr": 0.004526422125860678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30566037735849055,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.30566037735849055,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566018,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566018
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.025649381063029265,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.025649381063029265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.02798672466673622,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.02798672466673622
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268048,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268048
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.26262626262626265,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.26262626262626265,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.34196891191709844,
"acc_stderr": 0.03423465100104283,
"acc_norm": 0.34196891191709844,
"acc_norm_stderr": 0.03423465100104283
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30512820512820515,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.30512820512820515,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804725,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.017604304149256487,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.017604304149256487
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.030546745264953178,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.030546745264953178
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484256,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.0305002831765459,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.0305002831765459
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173092,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173092
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3128991060025543,
"acc_stderr": 0.01658093594030406,
"acc_norm": 0.3128991060025543,
"acc_norm_stderr": 0.01658093594030406
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.315028901734104,
"acc_stderr": 0.025009313790069713,
"acc_norm": 0.315028901734104,
"acc_norm_stderr": 0.025009313790069713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261431,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261431
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.026787453111906535,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.026787453111906535
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.33440514469453375,
"acc_stderr": 0.026795422327893947,
"acc_norm": 0.33440514469453375,
"acc_norm_stderr": 0.026795422327893947
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.024847921358063962,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.024847921358063962
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2777053455019557,
"acc_stderr": 0.011438741422769582,
"acc_norm": 0.2777053455019557,
"acc_norm_stderr": 0.011438741422769582
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27205882352941174,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.27205882352941174,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.018550634502952964,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.018550634502952964
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3142857142857143,
"acc_stderr": 0.029719329422417458,
"acc_norm": 0.3142857142857143,
"acc_norm_stderr": 0.029719329422417458
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573005,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573005
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071856,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071856
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.036310534964889056,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.036310534964889056
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688246,
"mc2": 0.47395581482094956,
"mc2_stderr": 0.0155095731496728
},
"harness|winogrande|5": {
"acc": 0.6393054459352802,
"acc_stderr": 0.013496064394234019
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-24T01:37:09+00:00 | {"pretty_name": "Evaluation run of Aryanne/ereb-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aryanne/ereb-test](https://huggingface.co/Aryanne/ereb-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aryanne__ereb-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T01:34:47.525846](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__ereb-test/blob/main/results_2024-01-24T01-34-47.525846.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2877610576763335,\n \"acc_stderr\": 0.03178333120937811,\n \"acc_norm\": 0.28998601852554035,\n \"acc_norm_stderr\": 0.03262751184295166,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.01611412415688246,\n \"mc2\": 0.47395581482094956,\n \"mc2_stderr\": 0.0155095731496728\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.378839590443686,\n \"acc_stderr\": 0.014175915490000322,\n \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009128\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5435172276438957,\n \"acc_stderr\": 0.00497084669755231,\n \"acc_norm\": 0.7104162517426807,\n \"acc_norm_stderr\": 0.004526422125860678\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.30566037735849055,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566018,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.03716177437566018\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 
0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n \"acc_stderr\": 0.025649381063029265,\n \"acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.025649381063029265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673622,\n \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673622\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268048,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268048\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.26262626262626265,\n \"acc_stderr\": 0.03135305009533084,\n \"acc_norm\": 0.26262626262626265,\n \"acc_norm_stderr\": 0.03135305009533084\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.34196891191709844,\n \"acc_stderr\": 0.03423465100104283,\n \"acc_norm\": 0.34196891191709844,\n \"acc_norm_stderr\": 0.03423465100104283\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.30512820512820515,\n \"acc_stderr\": 0.023346335293325887,\n \"acc_norm\": 0.30512820512820515,\n \"acc_norm_stderr\": 0.023346335293325887\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21467889908256882,\n \"acc_stderr\": 0.017604304149256487,\n \"acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.017604304149256487\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.030546745264953178,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.030546745264953178\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115071,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115071\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484256,\n \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n \"acc_stderr\": 0.0305002831765459,\n \"acc_norm\": 0.2914798206278027,\n \"acc_norm_stderr\": 0.0305002831765459\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.030236389942173092,\n \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.030236389942173092\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.3128991060025543,\n \"acc_stderr\": 0.01658093594030406,\n \"acc_norm\": 0.3128991060025543,\n \"acc_norm_stderr\": 0.01658093594030406\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.315028901734104,\n \"acc_stderr\": 0.025009313790069713,\n \"acc_norm\": 0.315028901734104,\n \"acc_norm_stderr\": 0.025009313790069713\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261431,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261431\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.026787453111906535,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.026787453111906535\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33440514469453375,\n \"acc_stderr\": 0.026795422327893947,\n \"acc_norm\": 0.33440514469453375,\n \"acc_norm_stderr\": 0.026795422327893947\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22340425531914893,\n \"acc_stderr\": 0.024847921358063962,\n \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.024847921358063962\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2777053455019557,\n \"acc_stderr\": 0.011438741422769582,\n \"acc_norm\": 0.2777053455019557,\n \"acc_norm_stderr\": 0.011438741422769582\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.27205882352941174,\n \"acc_stderr\": 0.02703304115168146,\n \"acc_norm\": 0.27205882352941174,\n \"acc_norm_stderr\": 0.02703304115168146\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.018550634502952964,\n \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.018550634502952964\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417458,\n \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417458\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573005,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573005\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.03410646614071856,\n \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.03410646614071856\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.036310534964889056,\n \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.036310534964889056\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.01611412415688246,\n \"mc2\": 0.47395581482094956,\n \"mc2_stderr\": 0.0155095731496728\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6393054459352802,\n \"acc_stderr\": 0.013496064394234019\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", 
"repo_url": "https://huggingface.co/Aryanne/ereb-test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|arc:challenge|25_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|gsm8k|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hellaswag|10_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-34-47.525846.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-34-47.525846.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-34-47.525846.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T01-34-47.525846.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-34-47.525846.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T01_34_47.525846", "path": ["**/details_harness|winogrande|5_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-24T01-34-47.525846.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_24T01_34_47.525846", "path": ["results_2024-01-24T01-34-47.525846.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T01-34-47.525846.parquet"]}]}]} | 2024-01-24T01:37:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Aryanne/ereb-test
Dataset automatically created during the evaluation run of model Aryanne/ereb-test on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-24T01:34:47.525846(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Aryanne/ereb-test\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/ereb-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T01:34:47.525846(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Aryanne/ereb-test\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/ereb-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T01:34:47.525846(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
59db7abc52c36be0df7521df1ef2f7ffeab16a2d |
# Dataset Card for Evaluation run of Aryanne/sheared-silicon10p
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aryanne/sheared-silicon10p](https://huggingface.co/Aryanne/sheared-silicon10p) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-silicon10p",
"harness_winogrande_5",
split="train")
```
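Beyond the per-task detail splits, the aggregated metrics for the most recent run can be loaded the same way; a minimal sketch using the "results" configuration and the "latest" split described above:

```python
from datasets import load_dataset

# Minimal sketch: the "results" configuration stores the aggregated
# metrics, and its "latest" split points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-silicon10p",
                       "results",
                       split="latest")
```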
## Latest results
These are the [latest results from run 2024-01-24T01:36:29.411153](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-silicon10p/blob/main/results_2024-01-24T01-36-29.411153.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2599470421169613,
"acc_stderr": 0.030910228069377786,
"acc_norm": 0.26177545641423533,
"acc_norm_stderr": 0.03173012245284717,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608774,
"mc2": 0.44852499686717784,
"mc2_stderr": 0.015613508500309613
},
"harness|arc:challenge|25": {
"acc": 0.32764505119453924,
"acc_stderr": 0.013715847940719342,
"acc_norm": 0.36177474402730375,
"acc_norm_stderr": 0.014041957945038073
},
"harness|hellaswag|10": {
"acc": 0.3850826528579964,
"acc_stderr": 0.004856203374715455,
"acc_norm": 0.5111531567416849,
"acc_norm_stderr": 0.004988539870174639
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351586,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351586
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916648,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916648
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2743589743589744,
"acc_stderr": 0.0226227657674932,
"acc_norm": 0.2743589743589744,
"acc_norm_stderr": 0.0226227657674932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275788,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275788
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958955,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958955
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780306,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02988691054762696,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02988691054762696
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484256,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407266,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407266
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749475,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749475
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225586,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225586
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.02558306248998484,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.02558306248998484
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890155,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890155
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2685788787483703,
"acc_stderr": 0.011320056629121734,
"acc_norm": 0.2685788787483703,
"acc_norm_stderr": 0.011320056629121734
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.023886881922440362,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.023886881922440362
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.02752963744017492,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.02752963744017492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078942,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078942
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608774,
"mc2": 0.44852499686717784,
"mc2_stderr": 0.015613508500309613
},
"harness|winogrande|5": {
"acc": 0.5722178374112076,
"acc_stderr": 0.013905134013839955
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
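If you only need the aggregated metrics shown above rather than the per-example details, a minimal sketch (using the "results" configuration and "latest" split named in this card) is:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Aryanne__sheared-silicon10p",
    "results",
    split="latest",
)
print(results[0])
```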
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
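Although the template above is unfilled, the structure can be explored programmatically. A minimal sketch, assuming network access to the Hugging Face Hub and a recent `datasets` release:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Aryanne__sheared-silicon10p"

# One configuration per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each configuration has one split per run timestamp, plus "latest"
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```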
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Aryanne__sheared-silicon10p | [
"region:us"
] | 2024-01-24T01:38:52+00:00 | {"pretty_name": "Evaluation run of Aryanne/sheared-silicon10p", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aryanne/sheared-silicon10p](https://huggingface.co/Aryanne/sheared-silicon10p) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aryanne__sheared-silicon10p\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T01:36:29.411153](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-silicon10p/blob/main/results_2024-01-24T01-36-29.411153.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2599470421169613,\n \"acc_stderr\": 0.030910228069377786,\n \"acc_norm\": 0.26177545641423533,\n \"acc_norm_stderr\": 0.03173012245284717,\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608774,\n \"mc2\": 0.44852499686717784,\n \"mc2_stderr\": 0.015613508500309613\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.32764505119453924,\n \"acc_stderr\": 0.013715847940719342,\n \"acc_norm\": 0.36177474402730375,\n \"acc_norm_stderr\": 0.014041957945038073\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3850826528579964,\n \"acc_stderr\": 0.004856203374715455,\n \"acc_norm\": 0.5111531567416849,\n \"acc_norm_stderr\": 0.004988539870174639\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351586,\n \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351586\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039776,\n \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039776\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22580645161290322,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.22580645161290322,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916648,\n \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916648\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2743589743589744,\n \"acc_stderr\": 0.0226227657674932,\n \"acc_norm\": 0.2743589743589744,\n \"acc_norm_stderr\": 0.0226227657674932\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275788,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275788\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958955,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958955\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780306,\n \"acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780306\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02988691054762696,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02988691054762696\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484256,\n \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n \"acc_stderr\": 0.026241132996407266,\n \"acc_norm\": 0.18834080717488788,\n \"acc_norm_stderr\": 0.026241132996407266\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.028911208802749475,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.028911208802749475\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n 
\"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225586,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225586\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n \"acc_stderr\": 0.02558306248998484,\n \"acc_norm\": 0.2829581993569132,\n \"acc_norm_stderr\": 0.02558306248998484\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890155,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890155\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2685788787483703,\n \"acc_stderr\": 0.011320056629121734,\n \"acc_norm\": 0.2685788787483703,\n \"acc_norm_stderr\": 0.011320056629121734\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.023886881922440362,\n \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.023886881922440362\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.02752963744017492,\n \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.02752963744017492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n \"acc_stderr\": 0.03106939026078942,\n \"acc_norm\": 0.19879518072289157,\n \"acc_norm_stderr\": 0.03106939026078942\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608774,\n \"mc2\": 0.44852499686717784,\n \"mc2_stderr\": 0.015613508500309613\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5722178374112076,\n \"acc_stderr\": 0.013905134013839955\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/Aryanne/sheared-silicon10p", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|arc:challenge|25_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|gsm8k|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hellaswag|10_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T01_36_29.411153", "path": ["**/details_harness|winogrande|5_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-24T01-36-29.411153.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_24T01_36_29.411153", "path": ["results_2024-01-24T01-36-29.411153.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T01-36-29.411153.parquet"]}]}]} | 2024-01-24T01:39:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Aryanne/sheared-silicon10p
Dataset automatically created during the evaluation run of model Aryanne/sheared-silicon10p on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
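```python
from datasets import load_dataset

# The repository name follows the standard Open LLM Leaderboard pattern
# ("details_<org>__<model>"); the config names are listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-silicon10p",
	"harness_winogrande_5",
	split="train")
```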
## Latest results
These are the latest results from run 2024-01-24T01:36:29.411153 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Aryanne/sheared-silicon10p\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/sheared-silicon10p on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T01:36:29.411153(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Aryanne/sheared-silicon10p\n\n\n\nDataset automatically created during the evaluation run of model Aryanne/sheared-silicon10p on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T01:36:29.411153(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bc57ad2af2b326f99366a6c836be7650ddb820b0 |
# AutoMathText
**AutoMathText** is an extensive and carefully curated dataset encompassing around **200 GB** of mathematical texts. It's a compilation sourced from a diverse range of platforms including various websites, arXiv, and GitHub (OpenWebMath, RedPajama, Algebraic Stack). This rich repository has been **autonomously selected (labeled) by the state-of-the-art open-source language model**, Qwen-72B. Each piece of content in the dataset is assigned **a score `lm_q1q2_score` within the range of [0, 1]**, reflecting its relevance, quality and educational value in the context of mathematical intelligence.
GitHub homepage: https://github.com/yifanzhang-pro/AutoMathText
ArXiv paper: https://arxiv.org/abs/2402.07625
## Objective
The primary aim of the **AutoMathText** dataset is to provide a comprehensive and reliable resource for a wide array of users - from academic researchers and educators to AI practitioners and mathematics enthusiasts. This dataset is particularly geared towards:
- Facilitating advanced research in **the intersection of mathematics and artificial intelligence**.
- Serving as an educational tool for **learning and teaching complex mathematical concepts**.
- Providing **a foundation for developing and training AI models** specialized in processing and understanding **mathematical content**.
## Configs
```YAML
configs:
- config_name: web-0.50-to-1.00
data_files:
- split: train
path:
- data/web/0.95-1.00.jsonl
- data/web/0.90-0.95.jsonl
- ...
- data/web/0.50-0.55.jsonl
default: true
- config_name: web-0.60-to-1.00
- config_name: web-0.70-to-1.00
- config_name: web-0.80-to-1.00
- config_name: web-full
data_files: data/web/*.jsonl
- config_name: arxiv-0.50-to-1.00
data_files:
- split: train
path:
- data/arxiv/0.90-1.00/*.jsonl
- ...
- data/arxiv/0.50-0.60/*.jsonl
- config_name: arxiv-0.60-to-1.00
- config_name: arxiv-0.70-to-1.00
- config_name: arxiv-0.80-to-1.00
- config_name: arxiv-full
data_files: data/arxiv/*/*.jsonl
- config_name: code-0.50-to-1.00
data_files:
- split: train
path:
- data/code/*/0.95-1.00.jsonl
- ...
- data/code/*/0.50-0.55.jsonl
  - config_name: code-python-0.50-to-1.00
    data_files:
    - split: train
      path:
      - data/code/python/0.95-1.00.jsonl
      - ...
      - data/code/python/0.50-0.55.jsonl
- config_name: code-python-0.60-to-1.00
- config_name: code-python-0.70-to-1.00
- config_name: code-python-0.80-to-1.00
- config_name: code-full
data_files: data/code/*/*.jsonl
```
How to load data:
```python
from datasets import load_dataset
ds = load_dataset("math-ai/AutoMathText", "web-0.50-to-1.00") # or any valid config_name
```
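For large-scale use, you can stream the dataset instead of downloading it in full, and filter on the `lm_q1q2_score` field described above. The sketch below assumes each record also exposes a `text` field (a common convention, not confirmed by this card); inspect a record's keys if yours differ.

```python
from itertools import islice
from datasets import load_dataset

# Stream the web configuration so the full ~200 GB is not downloaded up front.
ds = load_dataset("math-ai/AutoMathText", "web-0.50-to-1.00",
                  split="train", streaming=True)

# Keep only records whose Qwen-72B score is at least 0.80.
high_quality = (ex for ex in ds if ex["lm_q1q2_score"] >= 0.80)

for ex in islice(high_quality, 3):
    print(ex["lm_q1q2_score"], ex["text"][:100])
```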
## Features
- **Volume**: Approximately 200 GB of text data (in natural language and programming language).
- **Content**: A diverse collection of mathematical texts, including but not limited to research papers, educational articles, and code documentation.
- **Labeling**: Every text is **scored** by Qwen-72B, a sophisticated language model, ensuring a high standard of relevance and accuracy.
- **Scope**: Covers a wide spectrum of mathematical topics, making it suitable for various applications in advanced research and education.
## References
- OpenWebMath [[link]](https://huggingface.co/datasets/open-web-math/open-web-math)
- RedPajama [[link]](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
- Algebraic Stack [[link]](https://huggingface.co/datasets/EleutherAI/proof-pile-2) (a subset of Proof-Pile-2)
## Citation
We appreciate your use of **AutoMathText** in your work. If you find this repository helpful, please consider citing it and starring this repo. Feel free to contact [email protected] or open an issue if you have any questions (GitHub homepage: https://github.com/yifanzhang-pro/AutoMathText).
```bibtex
@article{zhang2024automathtext,
title={AutoMathText: Autonomous Data Selection with Language Models for Mathematical Texts},
author={Zhang, Yifan and Luo, Yifan and Yuan, Yang and Yao, Andrew Chi-Chih},
journal={arXiv preprint arXiv:2402.07625},
year={2024},
}
```
| math-ai/AutoMathText | [
"task_categories:text-generation",
"task_categories:question-answering",
"size_categories:10B<n<100B",
"language:en",
"license:cc-by-sa-4.0",
"mathematical-reasoning",
"reasoning",
"finetuning",
"pretraining",
"llm",
"arxiv:2402.07625",
"region:us"
] | 2024-01-24T01:39:26+00:00 | {"language": ["en"], "license": "cc-by-sa-4.0", "size_categories": ["10B<n<100B"], "task_categories": ["text-generation", "question-answering"], "pretty_name": "AutoMathText", "configs": [{"config_name": "web-0.50-to-1.00", "data_files": [{"split": "train", "path": ["data/web/0.95-1.00.jsonl", "data/web/0.90-0.95.jsonl", "data/web/0.85-0.90.jsonl", "data/web/0.80-0.85.jsonl", "data/web/0.75-0.80.jsonl", "data/web/0.70-0.75.jsonl", "data/web/0.65-0.70.jsonl", "data/web/0.60-0.65.jsonl", "data/web/0.55-0.60.jsonl", "data/web/0.50-0.55.jsonl"]}], "default": true}, {"config_name": "web-0.60-to-1.00", "data_files": [{"split": "train", "path": ["data/web/0.95-1.00.jsonl", "data/web/0.90-0.95.jsonl", "data/web/0.85-0.90.jsonl", "data/web/0.80-0.85.jsonl", "data/web/0.75-0.80.jsonl", "data/web/0.70-0.75.jsonl", "data/web/0.65-0.70.jsonl", "data/web/0.60-0.65.jsonl"]}]}, {"config_name": "web-0.70-to-1.00", "data_files": [{"split": "train", "path": ["data/web/0.95-1.00.jsonl", "data/web/0.90-0.95.jsonl", "data/web/0.85-0.90.jsonl", "data/web/0.80-0.85.jsonl", "data/web/0.75-0.80.jsonl", "data/web/0.70-0.75.jsonl"]}]}, {"config_name": "web-0.80-to-1.00", "data_files": [{"split": "train", "path": ["data/web/0.95-1.00.jsonl", "data/web/0.90-0.95.jsonl", "data/web/0.85-0.90.jsonl", "data/web/0.80-0.85.jsonl"]}]}, {"config_name": "web-full", "data_files": "data/web/*.jsonl"}, {"config_name": "arxiv-0.50-to-1.00", "data_files": [{"split": "train", "path": ["data/arxiv/0.90-1.00/*.jsonl", "data/arxiv/0.80-0.90/*.jsonl", "data/arxiv/0.70-0.80/*.jsonl", "data/arxiv/0.60-0.70/*.jsonl", "data/arxiv/0.50-0.60/*.jsonl"]}]}, {"config_name": "arxiv-0.60-to-1.00", "data_files": [{"split": "train", "path": ["data/arxiv/0.90-1.00/*.jsonl", "data/arxiv/0.80-0.90/*.jsonl", "data/arxiv/0.70-0.80/*.jsonl", "data/arxiv/0.60-0.70/*.jsonl"]}]}, {"config_name": "arxiv-0.70-to-1.00", "data_files": [{"split": "train", "path": ["data/arxiv/0.90-1.00/*.jsonl", "data/arxiv/0.80-0.90/*.jsonl", "data/arxiv/0.70-0.80/*.jsonl"]}]}, {"config_name": "arxiv-0.80-to-1.00", "data_files": [{"split": "train", "path": ["data/arxiv/0.90-1.00/*.jsonl", "data/arxiv/0.80-0.90/*.jsonl"]}]}, {"config_name": "arxiv-full", "data_files": [{"split": "train", "path": ["data/arxiv/0.90-1.00/*.jsonl", "data/arxiv/0.80-0.90/*.jsonl", "data/arxiv/0.70-0.80/*.jsonl", "data/arxiv/0.60-0.70/*.jsonl", "data/arxiv/0.50-0.60/*.jsonl", "data/arxiv/0.00-0.50/*.jsonl"]}]}, {"config_name": "code-0.50-to-1.00", "data_files": [{"split": "train", "path": ["data/code/agda/0.95-1.00.jsonl", "data/code/agda/0.90-0.95.jsonl", "data/code/agda/0.85-0.90.jsonl", "data/code/agda/0.80-0.85.jsonl", "data/code/agda/0.75-0.80.jsonl", "data/code/agda/0.70-0.75.jsonl", "data/code/agda/0.65-0.70.jsonl", "data/code/agda/0.60-0.65.jsonl", "data/code/agda/0.55-0.60.jsonl", "data/code/agda/0.50-0.55.jsonl", "data/code/c/0.95-1.00.jsonl", "data/code/c/0.90-0.95.jsonl", "data/code/c/0.85-0.90.jsonl", "data/code/c/0.80-0.85.jsonl", "data/code/c/0.75-0.80.jsonl", "data/code/c/0.70-0.75.jsonl", "data/code/c/0.65-0.70.jsonl", "data/code/c/0.60-0.65.jsonl", "data/code/c/0.55-0.60.jsonl", "data/code/c/0.50-0.55.jsonl", "data/code/cpp/0.95-1.00.jsonl", "data/code/cpp/0.90-0.95.jsonl", "data/code/cpp/0.85-0.90.jsonl", "data/code/cpp/0.80-0.85.jsonl", "data/code/cpp/0.75-0.80.jsonl", "data/code/cpp/0.70-0.75.jsonl", "data/code/cpp/0.65-0.70.jsonl", "data/code/cpp/0.60-0.65.jsonl", "data/code/cpp/0.55-0.60.jsonl", "data/code/cpp/0.50-0.55.jsonl", 
"data/code/fortran/0.95-1.00.jsonl", "data/code/fortran/0.90-0.95.jsonl", "data/code/fortran/0.85-0.90.jsonl", "data/code/fortran/0.80-0.85.jsonl", "data/code/fortran/0.75-0.80.jsonl", "data/code/fortran/0.70-0.75.jsonl", "data/code/fortran/0.65-0.70.jsonl", "data/code/fortran/0.60-0.65.jsonl", "data/code/fortran/0.55-0.60.jsonl", "data/code/fortran/0.50-0.55.jsonl", "data/code/gap/0.95-1.00.jsonl", "data/code/gap/0.90-0.95.jsonl", "data/code/gap/0.85-0.90.jsonl", "data/code/gap/0.80-0.85.jsonl", "data/code/gap/0.75-0.80.jsonl", "data/code/gap/0.70-0.75.jsonl", "data/code/gap/0.65-0.70.jsonl", "data/code/gap/0.60-0.65.jsonl", "data/code/gap/0.55-0.60.jsonl", "data/code/gap/0.50-0.55.jsonl", "data/code/github-coq-train/0.95-1.00.jsonl", "data/code/github-coq-train/0.90-0.95.jsonl", "data/code/github-coq-train/0.85-0.90.jsonl", "data/code/github-coq-train/0.80-0.85.jsonl", "data/code/github-coq-train/0.75-0.80.jsonl", "data/code/github-coq-train/0.70-0.75.jsonl", "data/code/github-coq-train/0.65-0.70.jsonl", "data/code/github-coq-train/0.60-0.65.jsonl", "data/code/github-coq-train/0.55-0.60.jsonl", "data/code/github-coq-train/0.50-0.55.jsonl", "data/code/github-isabelle-train/0.95-1.00.jsonl", "data/code/github-isabelle-train/0.90-0.95.jsonl", "data/code/github-isabelle-train/0.85-0.90.jsonl", "data/code/github-isabelle-train/0.80-0.85.jsonl", "data/code/github-isabelle-train/0.75-0.80.jsonl", "data/code/github-isabelle-train/0.70-0.75.jsonl", "data/code/github-isabelle-train/0.65-0.70.jsonl", "data/code/github-isabelle-train/0.60-0.65.jsonl", "data/code/github-isabelle-train/0.55-0.60.jsonl", "data/code/github-isabelle-train/0.50-0.55.jsonl", "data/code/github-lean-train/0.95-1.00.jsonl", "data/code/github-lean-train/0.90-0.95.jsonl", "data/code/github-lean-train/0.85-0.90.jsonl", "data/code/github-lean-train/0.80-0.85.jsonl", "data/code/github-lean-train/0.75-0.80.jsonl", "data/code/github-lean-train/0.70-0.75.jsonl", "data/code/github-lean-train/0.65-0.70.jsonl", "data/code/github-lean-train/0.60-0.65.jsonl", "data/code/github-lean-train/0.55-0.60.jsonl", "data/code/github-lean-train/0.50-0.55.jsonl", "data/code/github-MATLAB-train/0.95-1.00.jsonl", "data/code/github-MATLAB-train/0.90-0.95.jsonl", "data/code/github-MATLAB-train/0.85-0.90.jsonl", "data/code/github-MATLAB-train/0.80-0.85.jsonl", "data/code/github-MATLAB-train/0.75-0.80.jsonl", "data/code/github-MATLAB-train/0.70-0.75.jsonl", "data/code/github-MATLAB-train/0.65-0.70.jsonl", "data/code/github-MATLAB-train/0.60-0.65.jsonl", "data/code/github-MATLAB-train/0.55-0.60.jsonl", "data/code/github-MATLAB-train/0.50-0.55.jsonl", "data/code/haskell/0.95-1.00.jsonl", "data/code/haskell/0.90-0.95.jsonl", "data/code/haskell/0.85-0.90.jsonl", "data/code/haskell/0.80-0.85.jsonl", "data/code/haskell/0.75-0.80.jsonl", "data/code/haskell/0.70-0.75.jsonl", "data/code/haskell/0.65-0.70.jsonl", "data/code/haskell/0.60-0.65.jsonl", "data/code/haskell/0.55-0.60.jsonl", "data/code/haskell/0.50-0.55.jsonl", "data/code/idris/0.95-1.00.jsonl", "data/code/idris/0.90-0.95.jsonl", "data/code/idris/0.85-0.90.jsonl", "data/code/idris/0.80-0.85.jsonl", "data/code/idris/0.75-0.80.jsonl", "data/code/idris/0.70-0.75.jsonl", "data/code/idris/0.65-0.70.jsonl", "data/code/idris/0.60-0.65.jsonl", "data/code/idris/0.55-0.60.jsonl", "data/code/idris/0.50-0.55.jsonl", "data/code/isa_proofsteps/0.95-1.00.jsonl", "data/code/isa_proofsteps/0.90-0.95.jsonl", "data/code/isa_proofsteps/0.85-0.90.jsonl", "data/code/isa_proofsteps/0.80-0.85.jsonl", 
"data/code/isa_proofsteps/0.75-0.80.jsonl", "data/code/isa_proofsteps/0.70-0.75.jsonl", "data/code/isa_proofsteps/0.65-0.70.jsonl", "data/code/isa_proofsteps/0.60-0.65.jsonl", "data/code/isa_proofsteps/0.55-0.60.jsonl", "data/code/isa_proofsteps/0.50-0.55.jsonl", "data/code/julia/0.95-1.00.jsonl", "data/code/julia/0.90-0.95.jsonl", "data/code/julia/0.85-0.90.jsonl", "data/code/julia/0.80-0.85.jsonl", "data/code/julia/0.75-0.80.jsonl", "data/code/julia/0.70-0.75.jsonl", "data/code/julia/0.65-0.70.jsonl", "data/code/julia/0.60-0.65.jsonl", "data/code/julia/0.55-0.60.jsonl", "data/code/julia/0.50-0.55.jsonl", "data/code/jupyter-notebook/0.95-1.00.jsonl", "data/code/jupyter-notebook/0.90-0.95.jsonl", "data/code/jupyter-notebook/0.85-0.90.jsonl", "data/code/jupyter-notebook/0.80-0.85.jsonl", "data/code/jupyter-notebook/0.75-0.80.jsonl", "data/code/jupyter-notebook/0.70-0.75.jsonl", "data/code/jupyter-notebook/0.65-0.70.jsonl", "data/code/jupyter-notebook/0.60-0.65.jsonl", "data/code/jupyter-notebook/0.55-0.60.jsonl", "data/code/jupyter-notebook/0.50-0.55.jsonl", "data/code/lean_proofsteps/0.95-1.00.jsonl", "data/code/lean_proofsteps/0.90-0.95.jsonl", "data/code/lean_proofsteps/0.85-0.90.jsonl", "data/code/lean_proofsteps/0.80-0.85.jsonl", "data/code/lean_proofsteps/0.75-0.80.jsonl", "data/code/lean_proofsteps/0.70-0.75.jsonl", "data/code/lean_proofsteps/0.65-0.70.jsonl", "data/code/lean_proofsteps/0.60-0.65.jsonl", "data/code/lean_proofsteps/0.55-0.60.jsonl", "data/code/lean_proofsteps/0.50-0.55.jsonl", "data/code/maple/0.95-1.00.jsonl", "data/code/maple/0.90-0.95.jsonl", "data/code/maple/0.85-0.90.jsonl", "data/code/maple/0.80-0.85.jsonl", "data/code/maple/0.75-0.80.jsonl", "data/code/maple/0.70-0.75.jsonl", "data/code/maple/0.65-0.70.jsonl", "data/code/maple/0.60-0.65.jsonl", "data/code/maple/0.55-0.60.jsonl", "data/code/maple/0.50-0.55.jsonl", "data/code/python/0.95-1.00.jsonl", "data/code/python/0.90-0.95.jsonl", "data/code/python/0.85-0.90.jsonl", "data/code/python/0.80-0.85.jsonl", "data/code/python/0.75-0.80.jsonl", "data/code/python/0.70-0.75.jsonl", "data/code/python/0.65-0.70.jsonl", "data/code/python/0.60-0.65.jsonl", "data/code/python/0.55-0.60.jsonl", "data/code/python/0.50-0.55.jsonl", "data/code/r/0.95-1.00.jsonl", "data/code/r/0.90-0.95.jsonl", "data/code/r/0.85-0.90.jsonl", "data/code/r/0.80-0.85.jsonl", "data/code/r/0.75-0.80.jsonl", "data/code/r/0.70-0.75.jsonl", "data/code/r/0.65-0.70.jsonl", "data/code/r/0.60-0.65.jsonl", "data/code/r/0.55-0.60.jsonl", "data/code/r/0.50-0.55.jsonl", "data/code/tex/0.95-1.00.jsonl", "data/code/tex/0.90-0.95.jsonl", "data/code/tex/0.85-0.90.jsonl", "data/code/tex/0.80-0.85.jsonl", "data/code/tex/0.75-0.80.jsonl", "data/code/tex/0.70-0.75.jsonl", "data/code/tex/0.65-0.70.jsonl", "data/code/tex/0.60-0.65.jsonl", "data/code/tex/0.55-0.60.jsonl", "data/code/tex/0.50-0.55.jsonl"]}]}, {"config_name": "code-python-0.50-to-1.00", "data_files": [{"split": "train", "path": ["data/code/python/0.95-1.00.jsonl", "data/code/python/0.90-0.95.jsonl", "data/code/python/0.85-0.90.jsonl", "data/code/python/0.80-0.85.jsonl", "data/code/python/0.75-0.80.jsonl", "data/code/python/0.70-0.75.jsonl", "data/code/python/0.65-0.70.jsonl", "data/code/python/0.60-0.65.jsonl", "data/code/python/0.55-0.60.jsonl", "data/code/python/0.50-0.55.jsonl"]}]}, {"config_name": "code-python-0.60-to-1.00", "data_files": [{"split": "train", "path": ["data/code/python/0.95-1.00.jsonl", "data/code/python/0.90-0.95.jsonl", "data/code/python/0.85-0.90.jsonl", 
"data/code/python/0.80-0.85.jsonl", "data/code/python/0.75-0.80.jsonl", "data/code/python/0.70-0.75.jsonl", "data/code/python/0.65-0.70.jsonl", "data/code/python/0.60-0.65.jsonl"]}]}, {"config_name": "code-python-0.70-to-1.00", "data_files": [{"split": "train", "path": ["data/code/python/0.95-1.00.jsonl", "data/code/python/0.90-0.95.jsonl", "data/code/python/0.85-0.90.jsonl", "data/code/python/0.80-0.85.jsonl", "data/code/python/0.75-0.80.jsonl", "data/code/python/0.70-0.75.jsonl"]}]}, {"config_name": "code-python-0.80-to-1.00", "data_files": [{"split": "train", "path": ["data/code/python/0.95-1.00.jsonl", "data/code/python/0.90-0.95.jsonl", "data/code/python/0.85-0.90.jsonl", "data/code/python/0.80-0.85.jsonl"]}]}, {"config_name": "code-full", "data_files": [{"split": "train", "path": ["data/code/*/*.jsonl"]}]}], "tags": ["mathematical-reasoning", "reasoning", "finetuning", "pretraining", "llm"]} | 2024-02-14T09:01:18+00:00 | [
"2402.07625"
] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-question-answering #size_categories-10B<n<100B #language-English #license-cc-by-sa-4.0 #mathematical-reasoning #reasoning #finetuning #pretraining #llm #arxiv-2402.07625 #region-us
|
# AutoMathText
AutoMathText is an extensive and carefully curated dataset encompassing around 200 GB of mathematical texts. It's a compilation sourced from a diverse range of platforms including various websites, arXiv, and GitHub (OpenWebMath, RedPajama, Algebraic Stack). This rich repository has been autonomously selected (labeled) by the state-of-the-art open-source language model, Qwen-72B. Each piece of content in the dataset is assigned a score 'lm_q1q2_score' within the range of [0, 1], reflecting its relevance, quality and educational value in the context of mathematical intelligence.
GitHub homepage: URL
ArXiv paper: URL
## Objective
The primary aim of the AutoMathText dataset is to provide a comprehensive and reliable resource for a wide array of users - from academic researchers and educators to AI practitioners and mathematics enthusiasts. This dataset is particularly geared towards:
- Facilitating advanced research in the intersection of mathematics and artificial intelligence.
- Serving as an educational tool for learning and teaching complex mathematical concepts.
- Providing a foundation for developing and training AI models specialized in processing and understanding mathematical content.
## Configs
How to load data:
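```python
from datasets import load_dataset

ds = load_dataset("math-ai/AutoMathText", "web-0.50-to-1.00")  # or any valid config_name
```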
## Features
- Volume: Approximately 200 GB of text data (in natural language and programming language).
- Content: A diverse collection of mathematical texts, including but not limited to research papers, educational articles, and code documentation.
- Labeling: Every text is scored by Qwen-72B, a sophisticated language model, ensuring a high standard of relevance and accuracy.
- Scope: Covers a wide spectrum of mathematical topics, making it suitable for various applications in advanced research and education.
## References
- OpenWebMath [[link]](URL
- RedPajama [[link]](URL
- Algebraic Stack [[link]](URL (a subset of Proof-Pile-2)
We appreciate your use of AutoMathText in your work. If you find this repository helpful, please consider citing it and starring this repo. Feel free to contact zhangyif21@URL or open an issue if you have any questions (GitHub homepage: URL
| [
"# AutoMathText\n\nAutoMathText is an extensive and carefully curated dataset encompassing around 200 GB of mathematical texts. It's a compilation sourced from a diverse range of platforms including various websites, arXiv, and GitHub (OpenWebMath, RedPajama, Algebraic Stack). This rich repository has been autonomously selected (labeled) by the state-of-the-art open-source language model, Qwen-72B. Each piece of content in the dataset is assigned a score 'lm_q1q2_score' within the range of [0, 1], reflecting its relevance, quality and educational value in the context of mathematical intelligence.\n\nGitHub homepage: URL\n\nArXiv paper: URL",
"## Objective\n\nThe primary aim of the AutoMathText dataset is to provide a comprehensive and reliable resource for a wide array of users - from academic researchers and educators to AI practitioners and mathematics enthusiasts. This dataset is particularly geared towards:\n\n- Facilitating advanced research in the intersection of mathematics and artificial intelligence.\n- Serving as an educational tool for learning and teaching complex mathematical concepts.\n- Providing a foundation for developing and training AI models specialized in processing and understanding mathematical content.",
"## Configs\n\n\n\nHow to load data:",
"## Features\n\n- Volume: Approximately 200 GB of text data (in natural language and programming language).\n- Content: A diverse collection of mathematical texts, including but not limited to research papers, educational articles, and code documentation.\n- Labeling: Every text is scored by Qwen-72B, a sophisticated language model, ensuring a high standard of relevance and accuracy.\n- Scope: Covers a wide spectrum of mathematical topics, making it suitable for various applications in advanced research and education.",
"## References\n\n- OpenWebMath [[link]](URL\n- RedPajama [[link]](URL\n- Algebraick Stack [[link]](URL (a subset of Proof-Pile-2)\n\nWe appreciate your use of AutoMathText in your work. If you find this repository helpful, please consider citing it and star this repo. Feel free to contact zhangyif21@URL or open an issue if you have any questions (GitHub homepage: URL"
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-10B<n<100B #language-English #license-cc-by-sa-4.0 #mathematical-reasoning #reasoning #finetuning #pretraining #llm #arxiv-2402.07625 #region-us \n",
"# AutoMathText\n\nAutoMathText is an extensive and carefully curated dataset encompassing around 200 GB of mathematical texts. It's a compilation sourced from a diverse range of platforms including various websites, arXiv, and GitHub (OpenWebMath, RedPajama, Algebraic Stack). This rich repository has been autonomously selected (labeled) by the state-of-the-art open-source language model, Qwen-72B. Each piece of content in the dataset is assigned a score 'lm_q1q2_score' within the range of [0, 1], reflecting its relevance, quality and educational value in the context of mathematical intelligence.\n\nGitHub homepage: URL\n\nArXiv paper: URL",
"## Objective\n\nThe primary aim of the AutoMathText dataset is to provide a comprehensive and reliable resource for a wide array of users - from academic researchers and educators to AI practitioners and mathematics enthusiasts. This dataset is particularly geared towards:\n\n- Facilitating advanced research in the intersection of mathematics and artificial intelligence.\n- Serving as an educational tool for learning and teaching complex mathematical concepts.\n- Providing a foundation for developing and training AI models specialized in processing and understanding mathematical content.",
"## Configs\n\n\n\nHow to load data:",
"## Features\n\n- Volume: Approximately 200 GB of text data (in natural language and programming language).\n- Content: A diverse collection of mathematical texts, including but not limited to research papers, educational articles, and code documentation.\n- Labeling: Every text is scored by Qwen-72B, a sophisticated language model, ensuring a high standard of relevance and accuracy.\n- Scope: Covers a wide spectrum of mathematical topics, making it suitable for various applications in advanced research and education.",
"## References\n\n- OpenWebMath [[link]](URL\n- RedPajama [[link]](URL\n- Algebraick Stack [[link]](URL (a subset of Proof-Pile-2)\n\nWe appreciate your use of AutoMathText in your work. If you find this repository helpful, please consider citing it and star this repo. Feel free to contact zhangyif21@URL or open an issue if you have any questions (GitHub homepage: URL"
] |
d618b088b7bbda1c4f6241ecfc122c530d859730 |
# Dataset Card for Evaluation run of Tensoic/Kan-Llama-SFT-v0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Tensoic/Kan-Llama-SFT-v0.5](https://huggingface.co/Tensoic/Kan-Llama-SFT-v0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5",
"harness_winogrande_5",
split="train")
```
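The aggregated metrics can also be loaded directly through the `results` configuration; a minimal sketch (the exact column layout of the results parquet is not documented here, so inspect the loaded split if it differs):

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5",
	"results",
	split="latest")
print(results[0])
```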
## Latest results
These are the [latest results from run 2024-01-24T01:43:44.197286](https://huggingface.co/datasets/open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5/blob/main/results_2024-01-24T01-43-44.197286.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4272736498001841,
"acc_stderr": 0.03426594520244024,
"acc_norm": 0.43301807406696846,
"acc_norm_stderr": 0.03509638961981207,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916912,
"mc2": 0.4744031768522622,
"mc2_stderr": 0.015238059013971565
},
"harness|arc:challenge|25": {
"acc": 0.42918088737201365,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.47440273037542663,
"acc_norm_stderr": 0.01459223088529896
},
"harness|hellaswag|10": {
"acc": 0.5372435769766979,
"acc_stderr": 0.00497591966511654,
"acc_norm": 0.7271459868552081,
"acc_norm_stderr": 0.004445160997618376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389177,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44516129032258067,
"acc_stderr": 0.028272410186214906,
"acc_norm": 0.44516129032258067,
"acc_norm_stderr": 0.028272410186214906
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03815494308688929,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03815494308688929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056128,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056128
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396976,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095932,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095932
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5229357798165137,
"acc_stderr": 0.0214147570581755,
"acc_norm": 0.5229357798165137,
"acc_norm_stderr": 0.0214147570581755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.032259413526312945,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.032259413526312945
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.034849415144292316,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.034849415144292316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.030964810588786713,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.030964810588786713
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44785276073619634,
"acc_stderr": 0.03906947479456601,
"acc_norm": 0.44785276073619634,
"acc_norm_stderr": 0.03906947479456601
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458934,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458934
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.0311669573672359,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.0311669573672359
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5696040868454662,
"acc_stderr": 0.017705868776292388,
"acc_norm": 0.5696040868454662,
"acc_norm_stderr": 0.017705868776292388
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4421965317919075,
"acc_stderr": 0.0267386036438074,
"acc_norm": 0.4421965317919075,
"acc_norm_stderr": 0.0267386036438074
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761976,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761976
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.028541722692618874,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.028541722692618874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5144694533762058,
"acc_stderr": 0.028386198084177673,
"acc_norm": 0.5144694533762058,
"acc_norm_stderr": 0.028386198084177673
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4228395061728395,
"acc_stderr": 0.027487472980871605,
"acc_norm": 0.4228395061728395,
"acc_norm_stderr": 0.027487472980871605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3272490221642764,
"acc_stderr": 0.011983819806464733,
"acc_norm": 0.3272490221642764,
"acc_norm_stderr": 0.011983819806464733
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42320261437908496,
"acc_stderr": 0.01998780976948206,
"acc_norm": 0.42320261437908496,
"acc_norm_stderr": 0.01998780976948206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123935,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123935
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322416,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322416
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.03786720706234214,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.03786720706234214
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916912,
"mc2": 0.4744031768522622,
"mc2_stderr": 0.015238059013971565
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634472
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.0062163286402380944
}
}
```
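As a quick sanity check, the top-level `"all"` block is approximately the unweighted mean of the per-task metrics. A small sketch, assuming the JSON above has been saved locally as `results.json` (a hypothetical filename) and noting that the leaderboard's exact aggregation may differ slightly:

```python
import json

with open("results.json") as f:  # hypothetical local copy of the JSON above
    results = json.load(f)

# Average per-task accuracies, skipping the precomputed "all" entry and
# entries (such as truthfulqa:mc) that report mc1/mc2 instead of acc.
accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
print(f"mean acc over {len(accs)} tasks: {sum(accs) / len(accs):.4f}")
```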
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5 | [
"region:us"
] | 2024-01-24T01:46:09+00:00 | {"pretty_name": "Evaluation run of Tensoic/Kan-Llama-SFT-v0.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [Tensoic/Kan-Llama-SFT-v0.5](https://huggingface.co/Tensoic/Kan-Llama-SFT-v0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T01:43:44.197286](https://huggingface.co/datasets/open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5/blob/main/results_2024-01-24T01-43-44.197286.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4272736498001841,\n \"acc_stderr\": 0.03426594520244024,\n \"acc_norm\": 0.43301807406696846,\n \"acc_norm_stderr\": 0.03509638961981207,\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.4744031768522622,\n \"mc2_stderr\": 0.015238059013971565\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.42918088737201365,\n \"acc_stderr\": 0.014464085894870653,\n \"acc_norm\": 0.47440273037542663,\n \"acc_norm_stderr\": 0.01459223088529896\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5372435769766979,\n \"acc_stderr\": 0.00497591966511654,\n \"acc_norm\": 0.7271459868552081,\n \"acc_norm_stderr\": 0.004445160997618376\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389177,\n \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389177\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.4236111111111111,\n \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.44516129032258067,\n \"acc_stderr\": 0.028272410186214906,\n \"acc_norm\": 0.44516129032258067,\n \"acc_norm_stderr\": 0.028272410186214906\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03815494308688929,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03815494308688929\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056128,\n \"acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056128\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396976,\n \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095932,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095932\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.03156663099215416,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.03156663099215416\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5229357798165137,\n \"acc_stderr\": 0.0214147570581755,\n \"acc_norm\": 0.5229357798165137,\n \"acc_norm_stderr\": 0.0214147570581755\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.032259413526312945,\n \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.032259413526312945\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.034849415144292316,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.034849415144292316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6540084388185654,\n \"acc_stderr\": 0.030964810588786713,\n \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.030964810588786713\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.03906947479456601,\n \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.03906947479456601\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458934,\n \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458934\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.0311669573672359,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.0311669573672359\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.5696040868454662,\n \"acc_stderr\": 0.017705868776292388,\n \"acc_norm\": 0.5696040868454662,\n \"acc_norm_stderr\": 0.017705868776292388\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4421965317919075,\n \"acc_stderr\": 0.0267386036438074,\n \"acc_norm\": 0.4421965317919075,\n \"acc_norm_stderr\": 0.0267386036438074\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.014736926383761976,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.014736926383761976\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.028541722692618874,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.028541722692618874\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5144694533762058,\n \"acc_stderr\": 0.028386198084177673,\n \"acc_norm\": 0.5144694533762058,\n \"acc_norm_stderr\": 0.028386198084177673\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4228395061728395,\n \"acc_stderr\": 0.027487472980871605,\n \"acc_norm\": 0.4228395061728395,\n \"acc_norm_stderr\": 0.027487472980871605\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3272490221642764,\n \"acc_stderr\": 0.011983819806464733,\n \"acc_norm\": 0.3272490221642764,\n \"acc_norm_stderr\": 0.011983819806464733\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.42320261437908496,\n \"acc_stderr\": 0.01998780976948206,\n \"acc_norm\": 0.42320261437908496,\n \"acc_norm_stderr\": 0.01998780976948206\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278985,\n \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.03093285879278985\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n \"acc_stderr\": 0.03512310964123935,\n \"acc_norm\": 0.5572139303482587,\n \"acc_norm_stderr\": 0.03512310964123935\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322416,\n \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322416\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.03786720706234214,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.03786720706234214\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.4744031768522622,\n \"mc2_stderr\": 0.015238059013971565\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634472\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \"acc_stderr\": 
0.0062163286402380944\n }\n}\n```", "repo_url": "https://huggingface.co/Tensoic/Kan-Llama-SFT-v0.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|arc:challenge|25_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|gsm8k|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hellaswag|10_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T01_43_44.197286", "path": ["**/details_harness|winogrande|5_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-24T01-43-44.197286.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_24T01_43_44.197286", "path": ["results_2024-01-24T01-43-44.197286.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T01-43-44.197286.parquet"]}]}]} | 2024-01-24T01:46:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Tensoic/Kan-Llama-SFT-v0.5
Dataset automatically created during the evaluation run of model Tensoic/Kan-Llama-SFT-v0.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
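For example (illustrative; the dataset id below is inferred from the leaderboard's usual `details_<org>__<model>` naming convention and may need adjusting):

```python
from datasets import load_dataset

# Dataset id inferred from the leaderboard's naming convention (an assumption)
data = load_dataset("open-llm-leaderboard/details_Tensoic__Kan-Llama-SFT-v0.5",
                    "harness_winogrande_5",
                    split="train")
```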
## Latest results
These are the latest results from run 2024-01-24T01:43:44.197286 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Tensoic/Kan-Llama-SFT-v0.5\n\n\n\nDataset automatically created during the evaluation run of model Tensoic/Kan-Llama-SFT-v0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T01:43:44.197286(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Tensoic/Kan-Llama-SFT-v0.5\n\n\n\nDataset automatically created during the evaluation run of model Tensoic/Kan-Llama-SFT-v0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T01:43:44.197286(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
95777bfddb0ae7f885ac816c5b2341ff7aa52d22 | # Refugee Law Lab: Luck of the Draw III: Data
## Dataset Summary
The [Refugee Law Lab](https://refugeelab.ca) supports bulk open-access to Canadian legal data to facilitate research and advocacy.
Bulk open-access helps avoid asymmetrical access-to-justice and amplification of marginalization that
results when commercial actors leverage proprietary
legal datasets for profit -- a particular concern in the border control setting.
This is the dataset used for a [research project](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4322881) published in the Queen's Law Journal, undertaken at the Refugee Law Lab about outcomes in stays of removal in Canada's
Federal Court. Specifically, it includes information from the online Federal Court dockets for all immigration law cases filed between
1997 and 2022.
The dataset can be used for legal analytics (i.e. identifying patterns in legal
decision-making), to test ML and NLP tools on a bilingual dataset of Canadian legal materials, and to
pretrain language models for various tasks.
## Dataset Structure
### Data Instance
The dataset includes a single data instance for each online Federal Court docket involving immigration law filed between 1997 and 2022, 
as the dockets appeared when the data was gathered in November 2022.
### Data Fields
Data fields match the format used for the Refugee Law Lab's [Canadian Legal Data dataset](https://huggingface.co/datasets/refugee-law-lab/canadian-legal-data).
- citation (string): Legal citation for the document (neutral citation where available). In this dataset, the legal citation is the docket number, which is an identifier for the file assigned by the Federal Court. Docket numbers take the form IMM-#-YY: IMM signals that this is an immigration law docket, # is a sequential number starting at 1 that represents the order in which applications were received in a given year, and YY is the last two digits of the year in which the application was initially filed (see the illustrative sketch under Data Loading below).
- year (int32): Year of the document date, which can be useful for filtering. For this dataset, the year is the year when the application was initially filed.
- name (string): Name of the document; in this dataset, the style of cause of the court file
- date_filed (string): Date of the document (yyyy-mm-dd). In this dataset, this is the date the application was initially filed.
- city_filed (string): City where the application was initially filed
- nature (string): A category of proceedings assigned by the Federal Court
- class (string): A second category of proceedings assigned by the Federal Court
- track (string): A third category of proceedings assigned by the Federal Court
- documents (list of dictionaries): A list of dictionaries containing each docket entry (or row in the table of docket entries in a docket). Each dictionary contains the following key/value pairs:
* RE_NO: The number assigned to the docket entry by the Federal Court
* DOCNO: Where the entry involves the filing of a document, the number assigned to that document by the Federal Court
* DOC_DT: The date of the docket entry
* RECORDED_ENTRY: The content of the docket entry
- source_url (string): URL where the document was scraped and where the official version can be found
- scraped_timestamp (string): Date the document was scraped (yyyy-mm-dd)
### Data Languages
Some dockets are in English, some in French, and some alternate between English and French
### Data Splits
The data has not been split, so all data is in the train split.
### Data Loading
To load the data:
```python
from datasets import load_dataset
dataset = load_dataset("refugee-law-lab/luck-of-the-draw-iii", split="train")
```
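For instance, a minimal illustrative sketch (it assumes the field names listed above map directly onto the loaded records) that inspects the first docket:

```python
from datasets import load_dataset

dataset = load_dataset("refugee-law-lab/luck-of-the-draw-iii", split="train")
record = dataset[0]  # one docket per record; fields as described above

# Docket numbers take the form IMM-#-YY, e.g. "IMM-123-19"
prefix, seq, yy = record["citation"].split("-")
print(record["citation"], "filed", record["date_filed"], "in", record["city_filed"])

# Each docket entry is a dictionary with RE_NO, DOCNO, DOC_DT and RECORDED_ENTRY
for entry in record["documents"]:
    print(entry["DOC_DT"], entry["RECORDED_ENTRY"])
```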
To convert to dataframe:
```python
from datasets import load_dataset

dataset = load_dataset("refugee-law-lab/luck-of-the-draw-iii", split="train")
df = dataset.to_pandas()
```
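Once in a dataframe, standard pandas operations apply. For example, a quick illustrative query (assuming the conversion above) that counts 2020 filings by city using the `year` and `city_filed` fields:

```python
# Illustrative: applications filed in 2020, counted by city of filing
recent = df[df["year"] == 2020]
print(recent["city_filed"].value_counts().head())
```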
## Dataset Creation
### Curation Rationale
The dataset includes all Federal Court immigration law dockets available on the Federal Court's website at the time of research (November 2022). The Refugee Law Lab gathered this data for several projects, including the [Refugee Law Lab Portal](https://rllp.ca/) and the research article on Federal Court stays linked above.
### Source Data
#### Source
All data was gathered via the Federal Court's [website](https://www.fct-cf.gc.ca/en/home).
#### Initial Data Collection and Normalization
Details are available via links on the Refugee Law Lab's Github repository [Luck of the Draw III: Code & Data](https://github.com/Refugee-Law-Lab/luck-of-the-draw-iii).
### Personal and Sensitive Information
Documents may include personal and sensitive information. All documents have been published online by the Federal Court. While the open court principle mandates
that court materials be made available to the public, there are privacy risks when these
materials become easily and widely available. These privacy risks are particularly acute for marginalized groups,
including refugees and other non-citizens whose personal and sensitive information is included in some of the
documents in this dataset. For example, imagine a repressive government working with private data aggregators to
collect information that is used to target families of political opponents who have sought asylum abroad.
One mechanism used to try to achieve a balance between the open court principle
and privacy is that in publishing the documents in this dataset, the relevant courts and tribunals prohibit
search engines from indexing the documents. Users of this data are required to do the same.
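For anyone republishing these documents on the web, one common way to honour that requirement is to serve pages with an X-Robots-Tag header. The sketch below is illustrative only and assumes a Flask app; it is one possible mechanism, not an official instruction from the courts:

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_noindex_header(response):
    # Ask search engine crawlers not to index served pages or follow their links
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response
```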
### Non-Official Versions
Documents included in this dataset are unofficial copies. For official versions published by
the Government of Canada, please see the source URLs.
### Non-Affiliation / Endorsement
The reproduction of documents in this dataset was not done in affiliation with, or with the endorsement of
the Federal Court or the Government of Canada.
## Considerations for Using the Data
### Social Impact of Dataset
The Refugee Law Lab recognizes that this dataset -- and further research using the dataset -- raises challenging
questions about how to balance protecting privacy, enhancing government transparency, addressing information
asymmetries, and building technologies that leverage data to advance the rights and interests of
refugees and other displaced people, as well as assisting those working with them (rather than technologies that
[enhance the power of states](https://citizenlab.ca/2018/09/bots-at-the-gate-human-rights-analysis-automated-decision-making-in-canadas-immigration-refugee-system/)
to control the movement of people across borders).
More broadly, the Refugee Law Lab also recognizes that considerations around privacy and data protection are complex
and evolving. When working on migration, refugee law, data, technology and surveillance, we strive to foreground
intersectional understandings of the systemic harms perpetuated against groups historically made marginalized. We
encourage other users to do the same.
We also encourage users to try to avoid participating in building technologies that harm refugees and other
marginalized groups, as well as to connect with [community organizations](https://www.migrationtechmonitor.com/ways-to-help)
working in this space, and to [listen directly](https://www.migrationtechmonitor.com/about-us) and learn from people who are affected by new technologies.
We will review the use of these datasets periodically to examine whether continuing to publicly release these datasets achieves
the Refugee Law Lab's goals of advancing the rights and interests of refugees and other marginalized groups without creating
disproportionate risks and harms, including risks related to privacy and human rights.
### Discussion of Biases
The dataset reflects many biases present in legal decision-making, including biases based on race, immigration status, gender, sexual orientation, religion, disability, socio-economic class, and other intersecting categories of discrimination.
### Other Known Limitations
Due to the ways that all
legal datasets may be skewed, users of this dataset are encouraged to collaborate with or consult domain experts.
## Additional Information
### Licensing Information
Attribution-NonCommercial 4.0 International ([CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/))
NOTE: Users must also comply with [upstream licensing](https://www.fct-cf.gc.ca/en/pages/important-notices) for data obtained from the Federal Court, as
well as requests on source URLs not to allow indexing of the documents by search engines to protect privacy. As a result, users must
not make the data available in formats or locations that can be indexed by search engines.
### Warranties / Representations
We make no warranties or representations that the data included in this dataset is complete or accurate. Data
were obtained through academic research projects, including projects that use automated processes.
While we try to make the data as accurate as possible, our methodologies may result in
inaccurate or outdated data. As such, data should be viewed as preliminary information aimed to prompt
further research and discussion, rather than as definitive information.
### Dataset Curators
[Sean Rehaag](https://www.osgoode.yorku.ca/faculty-and-staff/rehaag-sean), Osgoode Hall Law School Professor & Director of the Refugee Law Lab
### Citation Information
Sean Rehaag, "Luck of the Draw III: Code & Data" (2023) online: Github: <https://github.com/Refugee-Law-Lab/luck-of-the-draw-iii>.
### Acknowledgements
This project draws on research supported by the Social Sciences and Humanities Research Council, the Law Foundation of Ontario, and the Digital Research Alliance of Canada. Jacob Danovich assisted with the infrastructure and scraping code for this project. | refugee-law-lab/luck-of-the-draw-iii | [
"size_categories:100K<n<1M",
"language:en",
"language:fr",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-01-24T01:51:12+00:00 | {"language": ["en", "fr"], "license": "cc-by-nc-4.0", "size_categories": ["100K<n<1M"]} | 2024-01-25T18:01:51+00:00 | [] | [
"en",
"fr"
] | TAGS
#size_categories-100K<n<1M #language-English #language-French #license-cc-by-nc-4.0 #region-us
| # Refugee Law Lab: Luck of the Draw III: Data
## Dataset Summary
The Refugee Law Lab supports bulk open-access to Canadian legal data to facilitate research and advocacy.
Bulk open-access helps avoid asymmetrical access-to-justice and amplification of marginalization that
results when commercial actors leverage proprietary
legal datasets for profit -- a particular concern in the border control setting.
This is the dataset used for a research project published in the Queen's Law Journal, undertaken at the Refugee Law Lab about outcomes in stays of removal in Canada's
Federal Court. Specifically, it includes information from the online Federal Court dockets for all immigration law cases filed between
1997 and 2022.
The dataset can be used for legal analytics (i.e. identifying patterns in legal
decision-making), to test ML and NLP tools on a bilingual dataset of Canadian legal materials, and to
pretrain language models for various tasks.
## Dataset Structure
### Data Instance
The dataset includes a single data instance for each online Federal Court docket involving immigration law filed between 1997 and 2022, 
as the dockets appeared when the data was gathered in November 2022.
### Data Fields
Data fields match the format used for the Refugee Law Lab's Canadian Legal Data dataset.
- citation (string): Legal citation for the document (neutral citation where available). In this dataset, the legal citation is the docket number, which is an identifier for the file assigned by the Federal Court. Docket numbers take the form IMM-#-YY: IMM signals that this is an immigration law docket, # is a sequential number starting at 1 that represents the order in which applications were received in a given year, and YY is the last two digits of the year in which the application was initially filed.
- year (int32): Year of the document date, which can be useful for filtering. For this dataset, the year is the year when the application was initially filed.
- name (string): Name of the document; in this dataset, the style of cause of the court file
- date_filed (string): Date of the document (yyyy-mm-dd). In this dataset, this is the date the application was initially filed.
- city_filed (string): City where the application was initially filed
- nature (string): A category of proceedings assigned by the Federal Court
- class (string): A second category of proceedings assigned by the Federal Court
- track (string): A third category of proceedings assigned by the Federal Court
- documents (list of dictionaries): A list of dictionaries containing each docket entry (or row in the table of docket entries in a docket). Each dictionary contains the following key/value pairs:
* RE_NO: The number assigned to the docket entry by the Federal Court
* DOCNO: Where the entry involves the filing of a document, the number assigned to that document by the Federal Court
* DOC_DT: The date of the docket entry
* RECORDED_ENTRY: The content of the docket entry
- source_url (string): URL where the document was scraped and where the official version can be found
- scraped_timestamp (string): Date the document was scraped (yyyy-mm-dd)
### Data Languages
Some dockets are in English, some in French, and some alternate between English and French
### Data Splits
The data has not been split, so all data is in the train split.
### Data Loading
To load the data:
To convert to dataframe:
## Dataset Creation
### Curation Rationale
The dataset includes all Federal Court immigration law dockets available on the Federal Court's website at the time of research (November 2022). The Refugee Law Lab gathered this data for several projects, including the Refugee Law Lab Portal and the research article on Federal Court stays linked above.
### Source Data
#### Source
All data was gathered via the Federal Court's website.
#### Initial Data Collection and Normalization
Details are available via links on the Refugee Law Lab's Github repository Luck of the Draw III: Code & Data (URL).
### Personal and Sensitive Information
Documents may include personal and sensitive information. All documents have been published online by the Federal Court. While the open court principle mandates
that court materials be made available to the public, there are privacy risks when these
materials become easily and widely available. These privacy risks are particularly acute for marginalized groups,
including refugees and other non-citizens whose personal and sensitive information is included in some of the
documents in this dataset. For example, imagine a repressive government working with private data aggregators to
collect information that is used to target families of political opponents who have sought asylum abroad.
One mechanism used to try to achieve a balance between the open court principle
and privacy is that in publishing the documents in this dataset, the relevant courts and tribunals prohibit
search engines from indexing the documents. Users of this data are required to do the same.
### Non-Official Versions
Documents included in this dataset are unofficial copies. For official versions published by
the Government of Canada, please see the source URLs.
### Non-Affiliation / Endorsement
The reproduction of documents in this dataset was not done in affiliation with, or with the endorsement of
the Federal Court or the Government of Canada.
## Considerations for Using the Data
### Social Impact of Dataset
The Refugee Law Lab recognizes that this dataset -- and further research using the dataset -- raises challenging
questions about how to balance protecting privacy, enhancing government transparency, addressing information
asymmetries, and building technologies that leverage data to advance the rights and interests of
refugees and other displaced people, as well as assisting those working with them (rather than technologies that
enhance the power of states
to control the movement of people across borders).
More broadly, the Refugee Law Lab also recognizes that considerations around privacy and data protection are complex
and evolving. When working on migration, refugee law, data, technology and surveillance, we strive to foreground
intersectional understandings of the systemic harms perpetuated against groups historically made marginalized. We
encourage other users to do the same.
We also encourage users to try to avoid participating in building technologies that harm refugees and other
marginalized groups, as well as to connect with community organizations
working in this space, and to listen directly and learn from people who are affected by new technologies.
We will review the use of these datasets periodically to examine whether continuing to publicly release these datasets achieves
the Refugee Law Lab's goals of advancing the rights and interests of refugees and other marginalized groups without creating
disproportionate risks and harms, including risks related to privacy and human rights.
### Discussion of Biases
The dataset reflects many biases present in legal decision-making, including biases based on race, immigration status, gender, sexual orientation, religion, disability, socio-economic class, and other intersecting categories of discrimination.
### Other Known Limitations
Due to the ways that all
legal datasets may be skewed, users of this dataset are encouraged to collaborate with or consult domain experts.
## Additional Information
### Licensing Information
Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)
NOTE: Users must also comply with upstream licensing for data obtained from the Federal Court, as
well as requests on source URLs not to allow indexing of the documents by search engines to protect privacy. As a result, users must
not make the data available in formats or locations that can be indexed by search engines.
### Warranties / Representations
We make no warranties or representations that the data included in this dataset is complete or accurate. Data
were obtained through academic research projects, including projects that use automated processes.
While we try to make the data as accurate as possible, our methodologies may result in
inaccurate or outdated data. As such, data should be viewed as preliminary information aimed to prompt
further research and discussion, rather than as definitive information.
### Dataset Curators
Sean Rehaag, Osgoode Hall Law School Professor & Director of the Refugee Law Lab
Sean Rehaag, "Luck of the Draw III: Code & Data" (2023) online: Github: <URL
### Acknowledgements
This project draws on research supported by the Social Sciences and Humanities Research Council, the Law Foundation of Ontario, and the Digital Research Alliance of Canada. Jacob Danovich assisted with the infrastructure and scraping code for this project. | [
"# Refugee Law Lab: Luck of the Draw III: Data",
"## Dataset Summary\n\nThe Refugee Law Lab supports bulk open-access to Canadian legal data to facilitate research and advocacy. \nBulk open-access helps avoid asymmetrical access-to-justice and amplification of marginalization that \nresults when commercial actors leverage proprietary \nlegal datasets for profit -- a particular concern in the border control setting.\n\nThis is the dataset used for a research project published in the Queen's Law Journal, undertaken at the Refugee Law Lab about outcomes in stays of removal in Canada's\nFederal Court. Specifically, it includes information from the online Federal Court dockets for all immigration law cases filed between\n1997 and 2022.\n\nThe dataset can be used for legal analytics (i.e. identifying patterns in legal \ndecision-making), to test ML and NLP tools on a bilingual dataset of Canadian legal materials, and to \npretrain language models for various tasks.",
"## Dataset Structure",
"### Data Instance\n\nThe datset includes a single data instance of all online Federal Court dockets involving immigration law filed between 1997 and 2022, \nas they appeared when the data was gathered in November 2022.",
"### Data Fields\n\nData fields match the formart used for the Refugee Law Lab's Canadian Legal Data dataset.\n\n- citation (string): Legal citation for the document (neutral citation where available). In this dataset, the legal citaiton is the docket number, which is a identifer for the file assigned by the Federal Court. Docket numbers take the form IMM-#-YY. IMM signals that this is an immigration law docket, # is a sequential number starting at 1 that represents the order in which applications were received in a given year, and YY is the last two digits of the year in which the application was initially filed.\n\n- year (int32): Year of the document date, which can be useful for filtering. For this dataset, the year is the year when the application was initially filed.\n\n- name (string): Name of the document, in this dataset the style of cause of a cour file\n\n- date_filed (string): Date of the document (yyyy-mm-dd). In this dataset the year is the date the application was filed.\n\n- city_filed (string): City where the application was initially filed\n\n- nature (string): A category of proceedings assigned by the Federal Court\n\n- class (string): A second category of proceedings assigned by the Federal court\n\n- track (string): A third category of proceedings assigned by the Federal Court\n\n- documents (list of dictionaries): A list of dictionaries containing each docket entry (or row in the table of docket entries in a docket). Each dictionary contains the following key/value pairs:\n\n * RE_NO: The number assigned to the docket entry by the Federal Court\n \n * DOCNO: Where the entry involves the filing of a document, the number assigned to that document by the Federal Court\n \n * DOC_DT: The date of the docket entry\n\n * RECORDED_ENTRY: The content of the docket entry\n\n- source_url (string): URL where the document was scraped and where the official version can be found\n\n- scraped_timestamp (string): Date the document was scraped (yyyy-mm-dd)",
"### Data Languages\n\nSome dockets are in English, some in French, and some alternate between English and French",
"### Data Splits\n\nThe data has not been split, so all data is in the train split.",
"### Data Loading\n\nTo load the data:\n\n\n\nTo convert to dataframe:",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset includes all Federal Court immigration law dockets available on the Federal Court's website at the time of research (November 2022). The Refugee Law Lab gathered this data for several projects, including the Refugee Law Lab Portal and the research article on Federal Court stays linked above.",
"### Source Data",
"#### Source\n\nAll data was gathered via the Federal Court's website.",
"#### Initial Data Collection and Normalization\n\nDetails are available via links on the Refugee Law Lab's Github respository [Luck of the Draw III: Code & Data]\n(URL",
"### Personal and Sensitive Information\n\nDocuments may include personal and sensitive information. All documents have been published online by the Federal Court. While the open court principle mandates \nthat court materials be made available to the public, there are privacy risks when these \nmaterials become easily and widely available. These privacy risks are particularly acute for marginalized groups, \nincluding refugees and other non-citizens whose personal and sensitive information is included in some of the\ndocuments in this dataset. For example, imagine a repressive government working with private data aggregators to \ncollect information that is used to target families of political opponents who have sought asylum abroad.\nOne mechanism used to try to achieve a balance between the open court principle \nand privacy is that in publishing the documents in this dataset, the relevant courts and tribunals prohibit \nsearch engines from indexing the documents. Users of this data are required to do the same.",
"### Non-Official Versions\n\nDocuments included in this dataset are unofficial copies. For official versions published by \nthe Government of Canada, please see the source URLs.",
"### Non-Affiliation / Endorsement\n\nThe reproduction of documents in this dataset was not done in affiliation with, or with the endorsement of \nthe Federal Court or the Government of Canada.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThe Refugee Law Lab recognizes that this dataset -- and further research using the dataset -- raises challenging \nquestions about how to balance protecting privacy, enhancing government transparency, addressing information \nasymmetries, and building technologies that leverage data to advance the rights and interests of \nrefugees and other displaced people, as well as assisting those working with them (rather than technologies that \nenhance the power of states \nto control the movement of people across borders).\n\nMore broadly, the Refugee Law Lab also recognizes that considerations around privacy and data protection are complex \nand evolving. When working on migration, refugee law, data, technology and surveillance, we strive to foreground \nintersectional understandings of the systemic harms perpetuated against groups historically made marginalized. We \nencourage other users to do the same.\n\nWe also encourage users to try to avoid participating in building technologies that harm refugees and other \nmarginalized groups, as well as to connect with community organizations \nworking in this space, and to listen directly and learn from people who are affected by new technologies. \n\nWe will review the use these datasets periodically to examine whether continuing to publicly release these datasets achieves \nthe Refugee Law Lab's goals of advancing the rights and interests of refugees and other marginalized groups without creating \ndisproportionate risks and harms, including risks related to privacy and human rights.",
"### Discussion of Biases\n\nThe dataset reflects many biases present in legal decision-making, including biases based on race, immigration status, gender, sexual orientation, religion, disability, socio-economic class, and other intersecting categories of discrimination.",
"### Other Known Limitations\n\nDue to the ways that all\nlegal datasets may be skewed, users of this dataset are encouraged to collaborate with or consult domain experts.",
"## Additional Information",
"### Licensing Information\n\nAttribution-NonCommercial 4.0 International (CC BY-NC 4.0)\n\nNOTE: Users must also comply with upstream licensing for data obtained from the Federal Court, as \nwell as requests on source urls not to allow indexing of the documents by search engines to protect privacy. As a result, users must \nnot make the data available in formats or locations that can be indexed by search engines.",
"### Warranties / Representations\n\nWe make no warranties or representations that the data included in this dataset is complete or accurate. Data \nwere obtained through academic research projects, including projects that use automated processes. \nWhile we try to make the data as accurate as possible, our methodologies may result in \ninaccurate or outdated data. As such, data should be viewed as preliminary information aimed to prompt \nfurther research and discussion, rather than as definitive information.",
"### Dataset Curators\n\nSean Rehaag, Osgoode Hall Law School Professor & Director of the Refugee Law Lab\n\n\n\nSean Rehaag, \"Luck of the Draw III: Code & Data\" (2023) online: Github: <URL",
"### Acknowledgements\n\nThis project draws on research supported by the Social Sciences and Humanities Research Council, the Law Foundation of Ontario, and the Digital Research Alliance of Canada. Jacob Danovich assisted with the infrastructure and scraping code for this project."
] | [
"TAGS\n#size_categories-100K<n<1M #language-English #language-French #license-cc-by-nc-4.0 #region-us \n",
"# Refugee Law Lab: Luck of the Draw III: Data",
"## Dataset Summary\n\nThe Refugee Law Lab supports bulk open-access to Canadian legal data to facilitate research and advocacy. \nBulk open-access helps avoid asymmetrical access-to-justice and amplification of marginalization that \nresults when commercial actors leverage proprietary \nlegal datasets for profit -- a particular concern in the border control setting.\n\nThis is the dataset used for a research project published in the Queen's Law Journal, undertaken at the Refugee Law Lab about outcomes in stays of removal in Canada's\nFederal Court. Specifically, it includes information from the online Federal Court dockets for all immigration law cases filed between\n1997 and 2022.\n\nThe dataset can be used for legal analytics (i.e. identifying patterns in legal \ndecision-making), to test ML and NLP tools on a bilingual dataset of Canadian legal materials, and to \npretrain language models for various tasks.",
"## Dataset Structure",
"### Data Instance\n\nThe datset includes a single data instance of all online Federal Court dockets involving immigration law filed between 1997 and 2022, \nas they appeared when the data was gathered in November 2022.",
"### Data Fields\n\nData fields match the formart used for the Refugee Law Lab's Canadian Legal Data dataset.\n\n- citation (string): Legal citation for the document (neutral citation where available). In this dataset, the legal citaiton is the docket number, which is a identifer for the file assigned by the Federal Court. Docket numbers take the form IMM-#-YY. IMM signals that this is an immigration law docket, # is a sequential number starting at 1 that represents the order in which applications were received in a given year, and YY is the last two digits of the year in which the application was initially filed.\n\n- year (int32): Year of the document date, which can be useful for filtering. For this dataset, the year is the year when the application was initially filed.\n\n- name (string): Name of the document, in this dataset the style of cause of a cour file\n\n- date_filed (string): Date of the document (yyyy-mm-dd). In this dataset the year is the date the application was filed.\n\n- city_filed (string): City where the application was initially filed\n\n- nature (string): A category of proceedings assigned by the Federal Court\n\n- class (string): A second category of proceedings assigned by the Federal court\n\n- track (string): A third category of proceedings assigned by the Federal Court\n\n- documents (list of dictionaries): A list of dictionaries containing each docket entry (or row in the table of docket entries in a docket). Each dictionary contains the following key/value pairs:\n\n * RE_NO: The number assigned to the docket entry by the Federal Court\n \n * DOCNO: Where the entry involves the filing of a document, the number assigned to that document by the Federal Court\n \n * DOC_DT: The date of the docket entry\n\n * RECORDED_ENTRY: The content of the docket entry\n\n- source_url (string): URL where the document was scraped and where the official version can be found\n\n- scraped_timestamp (string): Date the document was scraped (yyyy-mm-dd)",
"### Data Languages\n\nSome dockets are in English, some in French, and some alternate between English and French",
"### Data Splits\n\nThe data has not been split, so all data is in the train split.",
"### Data Loading\n\nTo load the data:\n\n\n\nTo convert to dataframe:",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset includes all Federal Court immigration law dockets available on the Federal Court's website at the time of research (November 2022). The Refugee Law Lab gathered this data for several projects, including the Refugee Law Lab Portal and the research article on Federal Court stays linked above.",
"### Source Data",
"#### Source\n\nAll data was gathered via the Federal Court's website.",
"#### Initial Data Collection and Normalization\n\nDetails are available via links on the Refugee Law Lab's Github respository [Luck of the Draw III: Code & Data]\n(URL",
"### Personal and Sensitive Information\n\nDocuments may include personal and sensitive information. All documents have been published online by the Federal Court. While the open court principle mandates \nthat court materials be made available to the public, there are privacy risks when these \nmaterials become easily and widely available. These privacy risks are particularly acute for marginalized groups, \nincluding refugees and other non-citizens whose personal and sensitive information is included in some of the\ndocuments in this dataset. For example, imagine a repressive government working with private data aggregators to \ncollect information that is used to target families of political opponents who have sought asylum abroad.\nOne mechanism used to try to achieve a balance between the open court principle \nand privacy is that in publishing the documents in this dataset, the relevant courts and tribunals prohibit \nsearch engines from indexing the documents. Users of this data are required to do the same.",
"### Non-Official Versions\n\nDocuments included in this dataset are unofficial copies. For official versions published by \nthe Government of Canada, please see the source URLs.",
"### Non-Affiliation / Endorsement\n\nThe reproduction of documents in this dataset was not done in affiliation with, or with the endorsement of \nthe Federal Court or the Government of Canada.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThe Refugee Law Lab recognizes that this dataset -- and further research using the dataset -- raises challenging \nquestions about how to balance protecting privacy, enhancing government transparency, addressing information \nasymmetries, and building technologies that leverage data to advance the rights and interests of \nrefugees and other displaced people, as well as assisting those working with them (rather than technologies that \nenhance the power of states \nto control the movement of people across borders).\n\nMore broadly, the Refugee Law Lab also recognizes that considerations around privacy and data protection are complex \nand evolving. When working on migration, refugee law, data, technology and surveillance, we strive to foreground \nintersectional understandings of the systemic harms perpetuated against groups historically made marginalized. We \nencourage other users to do the same.\n\nWe also encourage users to try to avoid participating in building technologies that harm refugees and other \nmarginalized groups, as well as to connect with community organizations \nworking in this space, and to listen directly and learn from people who are affected by new technologies. \n\nWe will review the use these datasets periodically to examine whether continuing to publicly release these datasets achieves \nthe Refugee Law Lab's goals of advancing the rights and interests of refugees and other marginalized groups without creating \ndisproportionate risks and harms, including risks related to privacy and human rights.",
"### Discussion of Biases\n\nThe dataset reflects many biases present in legal decision-making, including biases based on race, immigration status, gender, sexual orientation, religion, disability, socio-economic class, and other intersecting categories of discrimination.",
"### Other Known Limitations\n\nDue to the ways that all\nlegal datasets may be skewed, users of this dataset are encouraged to collaborate with or consult domain experts.",
"## Additional Information",
"### Licensing Information\n\nAttribution-NonCommercial 4.0 International (CC BY-NC 4.0)\n\nNOTE: Users must also comply with upstream licensing for data obtained from the Federal Court, as \nwell as requests on source urls not to allow indexing of the documents by search engines to protect privacy. As a result, users must \nnot make the data available in formats or locations that can be indexed by search engines.",
"### Warranties / Representations\n\nWe make no warranties or representations that the data included in this dataset is complete or accurate. Data \nwere obtained through academic research projects, including projects that use automated processes. \nWhile we try to make the data as accurate as possible, our methodologies may result in \ninaccurate or outdated data. As such, data should be viewed as preliminary information aimed to prompt \nfurther research and discussion, rather than as definitive information.",
"### Dataset Curators\n\nSean Rehaag, Osgoode Hall Law School Professor & Director of the Refugee Law Lab\n\n\n\nSean Rehaag, \"Luck of the Draw III: Code & Data\" (2023) online: Github: <URL",
"### Acknowledgements\n\nThis project draws on research supported by the Social Sciences and Humanities Research Council, the Law Foundation of Ontario, and the Digital Research Alliance of Canada. Jacob Danovich assisted with the infrastructure and scraping code for this project."
] |
bc6341d3416794fef80b7d56d76d425f44c53533 |
Dataset designed to PEFT fine-tune MHENNlitv3 to be more neutral and more competent in C# and Unity, along with some additional calculus, algebra and stats samples in an attempt to improve MHENN's mathematical reasoning capabilities. | netcat420/MHENN3.5 | [
"license:mit",
"region:us"
] | 2024-01-24T01:59:24+00:00 | {"license": "mit"} | 2024-01-24T02:07:06+00:00 | [] | [] | TAGS
#license-mit #region-us
|
Dataset designed to PEFT fine-tune MHENNlitv3 to be more neutral and more competent in C# and Unity, along with some additional calculus, algebra and stats samples in an attempt to improve MHENN's mathematical reasoning capabilities. | [
"# and unity, along with some additional calculus, algebra and stats samples in an attempt to improve MHENNs mathematical reasoning capabilities"
] | [
"TAGS\n#license-mit #region-us \n",
"# and unity, along with some additional calculus, algebra and stats samples in an attempt to improve MHENNs mathematical reasoning capabilities"
] |
dada83e6b1e0a0facad18ebc46b273385d6ee4f3 |
# SJ-Donald/orca-dpo-pairs-ko
SJ-Donald/orca-dpo-pairs-ko is a merged dataset built from the following datasets.
## Datasets
* [mncai/orca_dpo_pairs_ko](https://huggingface.co/datasets/mncai/orca_dpo_pairs_ko)
* [Ja-ck/Orca-DPO-Pairs-KO](https://huggingface.co/datasets/Ja-ck/Orca-DPO-Pairs-KO)
* [We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs](https://huggingface.co/datasets/We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs)
Merge datasets from above and drop duplicates.
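
A minimal sketch of that merge step is shown below. It assumes the three upstream splits have already been normalized to the final `system`/`question`/`chosen`/`rejected` schema (any column renaming is elided), and the deduplication key is an assumption rather than the documented procedure.

```Python
from datasets import Dataset, concatenate_datasets, load_dataset

sources = [
    "mncai/orca_dpo_pairs_ko",
    "Ja-ck/Orca-DPO-Pairs-KO",
    "We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs",
]

# Load each source and stack them; concatenate_datasets requires that
# all parts share identical features.
parts = [load_dataset(name, split="train") for name in sources]
merged = concatenate_datasets(parts)

# Drop exact duplicates via pandas, keyed here on the prompt/response
# pair (an assumed key, not necessarily the one used upstream).
df = merged.to_pandas().drop_duplicates(subset=["question", "chosen"])
deduped = Dataset.from_pandas(df, preserve_index=False)
print(deduped)
```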
## How to use
```Python
from datasets import load_dataset
ds = load_dataset("SJ-Donald/orca-dpo-pairs-ko")
print(ds)
DatasetDict({
train: Dataset({
features: ['system', 'question', 'chosen', 'rejected'],
num_rows: 36009
})
})
``` | SJ-Donald/orca-dpo-pairs-ko | [
"license:apache-2.0",
"orca-pairs",
"mncai/orca_dpo_pairs_ko",
"Ja-ck/Orca-DPO-Pairs-KO",
"We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs",
"region:us"
] | 2024-01-24T02:24:36+00:00 | {"license": "apache-2.0", "tags": ["orca-pairs", "mncai/orca_dpo_pairs_ko", "Ja-ck/Orca-DPO-Pairs-KO", "We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs"]} | 2024-01-24T02:27:21+00:00 | [] | [] | TAGS
#license-apache-2.0 #orca-pairs #mncai/orca_dpo_pairs_ko #Ja-ck/Orca-DPO-Pairs-KO #We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs #region-us
|
# SJ-Donald/orca-dpo-pairs-ko
SJ-Donald/orca-dpo-pairs-ko is a merged dataset built from the following datasets.
## Datasets
* mncai/orca_dpo_pairs_ko
* Ja-ck/Orca-DPO-Pairs-KO
* We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs
Merge datasets from above and drop duplicates.
## How to use
| [
"# SJ-Donald/orca-dpo-pairs-ko\n\nSJ-Donald/orca-dpo-pairs-ko is merged dataset from fllow datasets",
"## Datasets\n\n* mncai/orca_dpo_pairs_ko\n* Ja-ck/Orca-DPO-Pairs-KO\n* We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs\n\nMerge datasets from above and drop duplicates.",
"## How to use"
] | [
"TAGS\n#license-apache-2.0 #orca-pairs #mncai/orca_dpo_pairs_ko #Ja-ck/Orca-DPO-Pairs-KO #We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs #region-us \n",
"# SJ-Donald/orca-dpo-pairs-ko\n\nSJ-Donald/orca-dpo-pairs-ko is merged dataset from fllow datasets",
"## Datasets\n\n* mncai/orca_dpo_pairs_ko\n* Ja-ck/Orca-DPO-Pairs-KO\n* We-Want-GPU/Yi-Ko-DPO-Orca-DPO-Pairs\n\nMerge datasets from above and drop duplicates.",
"## How to use"
] |
29676e11f79a88d089c8fc783169c91f37201076 | # Wikipedia-Malaysian-Politicians multiturn
Original dataset at https://huggingface.co/datasets/Englios/Wikipedia-Malaysian-Politicians; we just translated it and prepared a multi-turn chat template. | malaysia-ai/Wikipedia-Malaysian-Politicians-multiturn | [
"language:ms",
"region:us"
] | 2024-01-24T03:44:17+00:00 | {"language": ["ms"]} | 2024-01-24T03:45:28+00:00 | [] | [
"ms"
] | TAGS
#language-Malay (macrolanguage) #region-us
| # Wikipedia-Malaysian-Politicians multiturn
Original dataset at URL; we just translated it and prepared a multi-turn chat template. | [
"# Wikipedia-Malaysian-Politicians multiturn\n\nOriginal dataset at URL we just translate and prepare multi-turn chat template."
] | [
"TAGS\n#language-Malay (macrolanguage) #region-us \n",
"# Wikipedia-Malaysian-Politicians multiturn\n\nOriginal dataset at URL we just translate and prepare multi-turn chat template."
] |
77cf9c09181450e496e7e122a75f63cbc79d3583 | # wikipedia-malaysian-road-sign-images multiturn
Original dataset at https://huggingface.co/datasets/wanadzhar913/wikipedia-malaysian-road-sign-images; we just prepared a multi-turn chat template. | malaysia-ai/wikipedia-malaysian-road-sign-images-multiturn | [
"language:ms",
"region:us"
] | 2024-01-24T03:46:37+00:00 | {"language": ["ms"]} | 2024-01-24T03:47:11+00:00 | [] | [
"ms"
] | TAGS
#language-Malay (macrolanguage) #region-us
| # wikipedia-malaysian-road-sign-images multiturn
Original dataset at URL; we just prepared a multi-turn chat template. | [
"# wikipedia-malaysian-road-sign-images multiturn\n\nOriginal dataset at URL we just prepare multi-turn chat template."
] | [
"TAGS\n#language-Malay (macrolanguage) #region-us \n",
"# wikipedia-malaysian-road-sign-images multiturn\n\nOriginal dataset at URL we just prepare multi-turn chat template."
] |
cefac32eb8d75159f3ba98d87ffc4b971bddd09d |
# Dataset Card for Evaluation run of jsfs11/WildMBXMarconi-SLERP-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/WildMBXMarconi-SLERP-7B](https://huggingface.co/jsfs11/WildMBXMarconi-SLERP-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B",
"harness_winogrande_5",
split="train")
```
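
The aggregated metrics can be retrieved the same way. Here is a minimal sketch using the "results" configuration described above; per the note above, the "train" split points at the latest run:

```python
from datasets import load_dataset

# "results" is the aggregated-results configuration; "train" tracks the
# latest evaluation run.
results = load_dataset("open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B",
	"results",
	split="train")
```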
## Latest results
These are the [latest results from run 2024-01-24T03:57:13.465418](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B/blob/main/results_2024-01-24T03-57-13.465418.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6550573661941113,
"acc_stderr": 0.03204593323100834,
"acc_norm": 0.6543869571104515,
"acc_norm_stderr": 0.032715471402145695,
"mc1": 0.5569155446756426,
"mc1_stderr": 0.017389730346877116,
"mc2": 0.6897987229376631,
"mc2_stderr": 0.015167779378758222
},
"harness|arc:challenge|25": {
"acc": 0.6996587030716723,
"acc_stderr": 0.013395909309957002,
"acc_norm": 0.7329351535836177,
"acc_norm_stderr": 0.012928933196496364
},
"harness|hellaswag|10": {
"acc": 0.7191794463254332,
"acc_stderr": 0.00448481564706465,
"acc_norm": 0.884883489344752,
"acc_norm_stderr": 0.0031851021916879108
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.01656382939904771,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.01656382939904771
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5569155446756426,
"mc1_stderr": 0.017389730346877116,
"mc2": 0.6897987229376631,
"mc2_stderr": 0.015167779378758222
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.7088703563305534,
"acc_stderr": 0.012513215297888463
}
}
```
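
For quick inspection, individual numbers can be pulled straight out of a report like the one above. A small sketch, assuming the JSON has been saved locally as `results.json` (a hypothetical path):

```python
import json

# Headline metrics live under the "all" key; per-task metrics sit under
# keys such as "harness|gsm8k|5".
with open("results.json") as f:
    report = json.load(f)

print(report["all"]["acc"])              # average accuracy
print(report["harness|gsm8k|5"]["acc"])  # GSM8K accuracy
```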
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B | [
"region:us"
] | 2024-01-24T03:59:37+00:00 | {"pretty_name": "Evaluation run of jsfs11/WildMBXMarconi-SLERP-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [jsfs11/WildMBXMarconi-SLERP-7B](https://huggingface.co/jsfs11/WildMBXMarconi-SLERP-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T03:57:13.465418](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B/blob/main/results_2024-01-24T03-57-13.465418.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550573661941113,\n \"acc_stderr\": 0.03204593323100834,\n \"acc_norm\": 0.6543869571104515,\n \"acc_norm_stderr\": 0.032715471402145695,\n \"mc1\": 0.5569155446756426,\n \"mc1_stderr\": 0.017389730346877116,\n \"mc2\": 0.6897987229376631,\n \"mc2_stderr\": 0.015167779378758222\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957002,\n \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.012928933196496364\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7191794463254332,\n \"acc_stderr\": 0.00448481564706465,\n \"acc_norm\": 0.884883489344752,\n \"acc_norm_stderr\": 0.0031851021916879108\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n 
\"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n \"acc_stderr\": 0.01656382939904771,\n \"acc_norm\": 0.4312849162011173,\n \"acc_norm_stderr\": 0.01656382939904771\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5569155446756426,\n \"mc1_stderr\": 0.017389730346877116,\n \"mc2\": 0.6897987229376631,\n \"mc2_stderr\": 0.015167779378758222\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \"acc_stderr\": 0.012513215297888463\n }\n}\n```", "repo_url": 
"https://huggingface.co/jsfs11/WildMBXMarconi-SLERP-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|arc:challenge|25_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|gsm8k|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hellaswag|10_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T03_57_13.465418", "path": ["**/details_harness|winogrande|5_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-24T03-57-13.465418.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_24T03_57_13.465418", "path": ["results_2024-01-24T03-57-13.465418.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T03-57-13.465418.parquet"]}]}]} | 2024-01-24T03:59:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jsfs11/WildMBXMarconi-SLERP-7B
Dataset automatically created during the evaluation run of model jsfs11/WildMBXMarconi-SLERP-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
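A minimal sketch, assuming the dataset follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by these leaderboard detail repositories (the exact repository id is inferred, not stated in this section):

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 configurations listed in this card's metadata;
# the "train" split always points to the latest run's results.
data = load_dataset("open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B",
	"harness_winogrande_5",
	split="train")
```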
## Latest results
These are the latest results from run 2024-01-24T03:57:13.465418 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
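Since the per-task results JSON is not reproduced in this section, one way to see what was recorded is to enumerate the available configurations; a sketch using the `datasets` utility, with the repository id assumed as above:

```python
from datasets import get_dataset_config_names

# Should list all 63 evaluation configurations plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_jsfs11__WildMBXMarconi-SLERP-7B")
print(len(configs), configs[:5])
```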
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jsfs11/WildMBXMarconi-SLERP-7B\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/WildMBXMarconi-SLERP-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T03:57:13.465418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jsfs11/WildMBXMarconi-SLERP-7B\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/WildMBXMarconi-SLERP-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T03:57:13.465418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6bf0bd7b49f53d5469a38989beb851ac9dc9bac0 |
# Dataset Card for Evaluation run of PetroGPT/WestSeverus-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PetroGPT/WestSeverus-7B-DPO](https://huggingface.co/PetroGPT/WestSeverus-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO",
"harness_winogrande_5",
split="train")
```
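The aggregated metrics quoted below can also be loaded directly from the "results" configuration mentioned above; a minimal sketch, assuming the same split naming ("latest" always pointing to the newest run):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for each run.
results = load_dataset("open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO",
	"results",
	split="latest")
print(results[0])
```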
## Latest results
These are the [latest results from run 2024-01-24T04:03:24.441746](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO/blob/main/results_2024-01-24T04-03-24.441746.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6548524099740454,
"acc_stderr": 0.03196835521217643,
"acc_norm": 0.6542270236664999,
"acc_norm_stderr": 0.03263896232037132,
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.7053363788221445,
"mc2_stderr": 0.014654405638711534
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518824,
"acc_norm": 0.7073378839590444,
"acc_norm_stderr": 0.013295916103619429
},
"harness|hellaswag|10": {
"acc": 0.6901015733917546,
"acc_stderr": 0.004615063817741863,
"acc_norm": 0.880103565026887,
"acc_norm_stderr": 0.003241765092912135
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.0235407993587233,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.0235407993587233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623227,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.7053363788221445,
"mc2_stderr": 0.014654405638711534
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.01043091746823742
},
"harness|gsm8k|5": {
"acc": 0.733131159969674,
"acc_stderr": 0.012183780551887952
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO | [
"region:us"
] | 2024-01-24T04:05:46+00:00 | {"pretty_name": "Evaluation run of PetroGPT/WestSeverus-7B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [PetroGPT/WestSeverus-7B-DPO](https://huggingface.co/PetroGPT/WestSeverus-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T04:03:24.441746](https://huggingface.co/datasets/open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO/blob/main/results_2024-01-24T04-03-24.441746.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6548524099740454,\n \"acc_stderr\": 0.03196835521217643,\n \"acc_norm\": 0.6542270236664999,\n \"acc_norm_stderr\": 0.03263896232037132,\n \"mc1\": 0.5287637698898409,\n \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.7053363788221445,\n \"mc2_stderr\": 0.014654405638711534\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518824,\n \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619429\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6901015733917546,\n \"acc_stderr\": 0.004615063817741863,\n \"acc_norm\": 0.880103565026887,\n \"acc_norm_stderr\": 0.003241765092912135\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438655,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.0235407993587233,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.0235407993587233\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623227,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.7053363788221445,\n \"mc2_stderr\": 0.014654405638711534\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.01043091746823742\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.733131159969674,\n \"acc_stderr\": 
0.012183780551887952\n }\n}\n```", "repo_url": "https://huggingface.co/PetroGPT/WestSeverus-7B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|arc:challenge|25_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|gsm8k|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hellaswag|10_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-03-24.441746.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-03-24.441746.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-03-24.441746.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T04-03-24.441746.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-03-24.441746.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T04_03_24.441746", "path": ["**/details_harness|winogrande|5_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-24T04-03-24.441746.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_24T04_03_24.441746", "path": ["results_2024-01-24T04-03-24.441746.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T04-03-24.441746.parquet"]}]}]} | 2024-01-24T04:06:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PetroGPT/WestSeverus-7B-DPO
Dataset automatically created during the evaluation run of model PetroGPT/WestSeverus-7B-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
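A minimal example, mirroring the loader recorded in this card's metadata (any config name listed there can be substituted for `harness_winogrande_5`; the aggregated "results" config is assumed to follow the same split layout as the per-task configs):

```python
from datasets import load_dataset

# Per-task details: any harness config from this card's metadata works here;
# the "train" split always points at the newest run.
data = load_dataset("open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO",
	"harness_winogrande_5",
	split="train")

# Aggregated metrics for the whole run live in the "results" config.
results = load_dataset("open-llm-leaderboard/details_PetroGPT__WestSeverus-7B-DPO",
	"results",
	split="latest")
```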
## Latest results
These are the latest results from run 2024-01-24T04:03:24.441746 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
6c75abc7477ef1d7832cc356e9de41d09a8b153f |
# Dataset Card for Evaluation run of cris177/Orca-Hermes-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cris177/Orca-Hermes-7B-slerp](https://huggingface.co/cris177/Orca-Hermes-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp",
"harness_winogrande_5",
split="train")
```
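The aggregated scores can be pulled the same way; a minimal sketch, assuming the "results" config follows the same "latest"-split convention as the per-task configs in this dump:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model.
results = load_dataset("open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp",
	"results",
	split="latest")
print(results[0])  # one row holding the scores reported below
```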
## Latest results
These are the [latest results from run 2024-01-24T04:05:19.202804](https://huggingface.co/datasets/open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp/blob/main/results_2024-01-24T04-05-19.202804.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6366503993892096,
"acc_stderr": 0.03225341991875598,
"acc_norm": 0.6391914565848391,
"acc_norm_stderr": 0.032895103741050154,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5284492415147731,
"mc2_stderr": 0.01545047280078059
},
"harness|arc:challenge|25": {
"acc": 0.6186006825938567,
"acc_stderr": 0.014194389086685247,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839159
},
"harness|hellaswag|10": {
"acc": 0.6559450308703445,
"acc_stderr": 0.004740882120999966,
"acc_norm": 0.844353714399522,
"acc_norm_stderr": 0.003617787934747751
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062153,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3575418994413408,
"acc_stderr": 0.016029394474894886,
"acc_norm": 0.3575418994413408,
"acc_norm_stderr": 0.016029394474894886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162666,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162666
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5284492415147731,
"mc2_stderr": 0.01545047280078059
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643414
},
"harness|gsm8k|5": {
"acc": 0.5549658832448825,
"acc_stderr": 0.013689011567414198
}
}
```
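Since the snapshot above is plain JSON, individual scores are easy to pull out programmatically. A minimal sketch, assuming the block has been saved locally as `results.json` (an illustrative filename, not part of this repo):

```python
import json

# Assumes the JSON snapshot above was saved as results.json (illustrative name).
with open("results.json") as f:
    results = json.load(f)

# Keys are "harness|<task>|<n_shot>"; values hold acc / acc_norm and stderr fields.
print(results["harness|gsm8k|5"]["acc"])       # 0.5549658832448825
print(results["harness|winogrande|5"]["acc"])  # 0.7790055248618785
```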
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp | [
"region:us"
] | 2024-01-24T04:07:37+00:00 | {"pretty_name": "Evaluation run of cris177/Orca-Hermes-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [cris177/Orca-Hermes-7B-slerp](https://huggingface.co/cris177/Orca-Hermes-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T04:05:19.202804](https://huggingface.co/datasets/open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp/blob/main/results_2024-01-24T04-05-19.202804.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6366503993892096,\n \"acc_stderr\": 0.03225341991875598,\n \"acc_norm\": 0.6391914565848391,\n \"acc_norm_stderr\": 0.032895103741050154,\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5284492415147731,\n \"mc2_stderr\": 0.01545047280078059\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6186006825938567,\n \"acc_stderr\": 0.014194389086685247,\n \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839159\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6559450308703445,\n \"acc_stderr\": 0.004740882120999966,\n \"acc_norm\": 0.844353714399522,\n \"acc_norm_stderr\": 0.003617787934747751\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062153,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062153\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3575418994413408,\n \"acc_stderr\": 0.016029394474894886,\n \"acc_norm\": 0.3575418994413408,\n \"acc_norm_stderr\": 0.016029394474894886\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596729,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596729\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162666,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162666\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5284492415147731,\n \"mc2_stderr\": 0.01545047280078059\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643414\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5549658832448825,\n \"acc_stderr\": 0.013689011567414198\n 
}\n}\n```", "repo_url": "https://huggingface.co/cris177/Orca-Hermes-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|arc:challenge|25_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|gsm8k|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hellaswag|10_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-05-19.202804.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-05-19.202804.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-05-19.202804.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T04-05-19.202804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-05-19.202804.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T04_05_19.202804", "path": ["**/details_harness|winogrande|5_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-24T04-05-19.202804.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_24T04_05_19.202804", "path": ["results_2024-01-24T04-05-19.202804.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T04-05-19.202804.parquet"]}]}]} | 2024-01-24T04:08:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cris177/Orca-Hermes-7B-slerp
Dataset automatically created during the evaluation run of model cris177/Orca-Hermes-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
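Because each evaluated task is exposed as its own configuration, the available configs and their splits can be listed up front with the standard `datasets` helpers. A small sketch (the timestamped split name below is taken from this repo's metadata):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp"
print(get_dataset_config_names(repo))            # the 63 task configs plus "results"
print(get_dataset_split_names(repo, "results"))  # ['2024_01_24T04_05_19.202804', 'latest']
```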
To load the details from a run, you can for instance do the following:
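```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the task configs; any other config name works the same way.
data = load_dataset("open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp",
	"harness_winogrande_5",
	split="train")
```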
## Latest results
These are the latest results from run 2024-01-24T04:05:19.202804 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
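The aggregated numbers themselves can be fetched without downloading any per-example details by loading the "results" configuration; a minimal sketch (the "latest" split name comes from this repo's config list):

```python
from datasets import load_dataset

# "latest" always points at the most recent run's aggregated metrics.
results = load_dataset(
    "open-llm-leaderboard/details_cris177__Orca-Hermes-7B-slerp",
    "results",
    split="latest",
)
```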
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cris177/Orca-Hermes-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model cris177/Orca-Hermes-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T04:05:19.202804(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cris177/Orca-Hermes-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model cris177/Orca-Hermes-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T04:05:19.202804(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
cbcd22a26c208b4c7dbc966f7d3b01b2b0032e15 | # Dataset Card for "counterfactual_babylm_keys_to_pipps_2913"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kanishka/counterfactual_babylm_keys_to_pipps_2913 | [
"region:us"
] | 2024-01-24T04:10:41+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 582987526, "num_examples": 11635530}, {"name": "validation", "num_bytes": 56120230, "num_examples": 1026747}], "download_size": 422376004, "dataset_size": 639107756}} | 2024-01-24T04:11:01+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "counterfactual_babylm_keys_to_pipps_2913"
More Information needed | [
"# Dataset Card for \"counterfactual_babylm_keys_to_pipps_2913\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"counterfactual_babylm_keys_to_pipps_2913\"\n\nMore Information needed"
] |
9141f1663d5354d6f7e59e222894899dfaa27035 | # Dataset Card for "counterfactual_babylm_keys_to_pipps_all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kanishka/counterfactual_babylm_keys_to_pipps_all | [
"region:us"
] | 2024-01-24T04:11:16+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 596284709, "num_examples": 11667448}, {"name": "validation", "num_bytes": 56120230, "num_examples": 1026747}], "download_size": 431052904, "dataset_size": 652404939}} | 2024-01-24T04:11:35+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "counterfactual_babylm_keys_to_pipps_all"
More Information needed | [
"# Dataset Card for \"counterfactual_babylm_keys_to_pipps_all\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"counterfactual_babylm_keys_to_pipps_all\"\n\nMore Information needed"
] |
22f8211df8d21344129e66d7da8de3796da1b6f2 | # Dataset Card for "cdnc_law_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | iamnguyen/cdnc_law_test | [
"region:us"
] | 2024-01-24T04:16:33+00:00 | {"dataset_info": {"features": [{"name": "citation", "dtype": "string"}, {"name": "content", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 138635, "num_examples": 100}], "download_size": 63379, "dataset_size": 138635}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-24T04:16:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cdnc_law_test"
More Information needed | [
"# Dataset Card for \"cdnc_law_test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cdnc_law_test\"\n\nMore Information needed"
] |
b0804cbc866bb75b9cbe8fcca89d9c1adfff12cd | # Dataset Card for "cdnc_law_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | iamnguyen/cdnc_law_eval | [
"region:us"
] | 2024-01-24T04:36:56+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "citation", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 153441, "num_examples": 100}], "download_size": 71332, "dataset_size": 153441}} | 2024-01-24T06:52:17+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cdnc_law_eval"
More Information needed | [
"# Dataset Card for \"cdnc_law_eval\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cdnc_law_eval\"\n\nMore Information needed"
] |
198d5ff8b108a3b65fb657e01f0c66ed35f7b820 |
# Dataset of d/D/D/D (Nikke: Goddess of Victory)
This is the dataset of d/D/D/D (Nikke: Goddess of Victory), containing 29 images and their tags.
The core tags of this character are `black_hair, bangs, red_eyes, blunt_bangs, short_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 43.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/d_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 23.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/d_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 78 | 49.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/d_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 37.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/d_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 78 | 71.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/d_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/d_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, from_behind, hood_up, looking_at_viewer, looking_back, solo, thighs, ass_focus, long_sleeves, black_gloves, cowboy_shot, indoors, lips, short_shorts, white_shorts, artist_name, belt, black_jacket, breasts, brown_hair, closed_mouth, cropped_jacket, hoodie, blush, cat_hood, crop_top, midriff, shiny, standing |
| 1 | 14 |  |  |  |  |  | hood_up, 1girl, black_gloves, looking_at_viewer, long_sleeves, solo, holding_weapon, closed_mouth, tactical_clothes, belt, gun, shorts, cape, jacket, thighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | from_behind | hood_up | looking_at_viewer | looking_back | solo | thighs | ass_focus | long_sleeves | black_gloves | cowboy_shot | indoors | lips | short_shorts | white_shorts | artist_name | belt | black_jacket | breasts | brown_hair | closed_mouth | cropped_jacket | hoodie | blush | cat_hood | crop_top | midriff | shiny | standing | holding_weapon | tactical_clothes | gun | shorts | cape | jacket | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:----------|:--------------------|:---------------|:-------|:---------|:------------|:---------------|:---------------|:--------------|:----------|:-------|:---------------|:---------------|:--------------|:-------|:---------------|:----------|:-------------|:---------------|:-----------------|:---------|:--------|:-----------|:-----------|:----------|:--------|:-----------|:-----------------|:-------------------|:------|:---------|:-------|:---------|:-------------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | | X | X | | X | X | | | | | | | X | | | | X | | | | | | | | | X | X | X | X | X | X | X |
| CyberHarem/d_nikke | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-24T04:45:36+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-24T04:52:26+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of d/D/D/D (Nikke: Goddess of Victory)
==============================================
This is the dataset of d/D/D/D (Nikke: Goddess of Victory), containing 29 images and their tags.
The core tags of this character are 'black\_hair, bangs, red\_eyes, blunt\_bangs, short\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code:
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5c49c77babbad366c7e03cf75bd9e7d78d36eb13 | Dynamic Neural Architecture Optimization (DNAO) Through Adaptive Meta-Learning: Overview and Key Components
Background
Neural Architecture Search (NAS): NAS refers to the automated discovery of efficient neural network architectures for given tasks without extensive manual intervention (Baker et al., 2016; Zoph & Le, 2018). It enables researchers and practitioners to find high-performing models tailored to specific challenges.
Meta-Learning: Also known as 'learning to learn', meta-learning accelerates the learning process of machine learning models by transferring knowledge between related tasks (Schmidhuber, 1987; Thrun & Pratt, 1998; Schmidhuber, 2013).
Introducing DNAO
Dynamic Neural Architecture Optimization (DNAO) was initially proposed in Xie et al., 2020 and builds on the concepts of NAS and meta-learning. DNAO uses adaptive meta-learning to combine a self-evolving neural network architecture with a meta-learning component, enabling enhanced performance and reduced computational cost. Applications include image recognition, natural language processing, and speech recognition.
Key components
Self-evolving neural network architecture: Three approaches used within DNAO are Evolution Strategies (ES), Genetic Algorithms (GA), and Reinforcement Learning (RL). They allow for online adaptation of the neural network architecture according to changing problem conditions.
Evolution Strategies (ES): ES involves iteratively updating parameters using random mutations and evaluating fitness (Back et al., 1997; Real et al., 2019); a minimal ES sketch follows after this list.
Genetic Algorithms (GA): GA mimics biological evolution through crossover, mutation, and survival-of-the-fittest principles (Goldberg, 1989; Deb et al., 2002).
Reinforcement Learning (RL): RL adjusts actions based on reward signals, gradually learning optimal policies (Sutton & Barto, 1998).
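For concreteness, here is a minimal evolution-strategies sketch in the style of OpenAI-ES: parameters are perturbed with Gaussian noise, each perturbation's fitness is evaluated, and the parameters move along the fitness-weighted average of the noise. The quadratic toy fitness, population size, and step sizes are illustrative assumptions, not part of DNAO itself.

```python
import numpy as np

# Toy objective standing in for validation performance: maximize -||theta||^2.
def fitness(theta):
    return -np.sum(theta ** 2)

theta = np.random.randn(10)     # parameters (or a continuous encoding of an architecture)
sigma, lr, pop = 0.1, 0.02, 50  # noise scale, step size, population size

for step in range(200):
    noise = np.random.randn(pop, theta.size)                       # one perturbation per individual
    rewards = np.array([fitness(theta + sigma * n) for n in noise])
    rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)  # normalize rewards
    theta += lr / (pop * sigma) * noise.T @ rewards                # fitness-weighted update

print(fitness(theta))  # should approach 0, the optimum
```

The same loop applies to architecture search once candidate structures are given a continuous encoding.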
Meta-learning component: Within DNAO, three prominent meta-learning techniques are employed: Model-agnostic Meta-Learning (MAML), Progressive Neural Architecture Search (PNAS), and One Shot Neural Architecture Search (OSNAS). Each technique facilitates rapid adaptation to new tasks while leveraging prior knowledge.
Model-agnostic Meta-Learning (MAML): A meta-learning algorithm designed for few-shot learning, allowing fast parameter updates when faced with new tasks (Finn et al., 2017); a first-order sketch follows after this list.
Progressive Neural Architecture Search (PNAS): Gradually grows child models by adding layers to parent models, retaining structural similarity among generations (Chen et al., 2018).
One Shot Neural Architecture Search (OSNAS): Predicts entire neural architectures using one single sample, drastically reducing computation (Brock et al., 2017).
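As a rough illustration of MAML's two-level structure, the following first-order sketch adapts cloned parameters on a support set (inner loop) and applies the query-set gradients to the meta-parameters (outer loop). The toy task family y = a*x, the linear model, and the learning rates are demonstration assumptions only.

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(1, 1)                        # meta-parameters
meta_opt = torch.optim.SGD(model.parameters(), lr=1e-2)
inner_lr = 1e-2

for step in range(200):
    a = torch.rand(1) * 4 - 2                        # sample a task (slope)
    xs, xq = torch.randn(8, 1), torch.randn(8, 1)
    ys, yq = a * xs, a * xq                          # support / query sets

    # Inner loop: one adaptation step on cloned "fast" parameters.
    fast = [p.clone() for p in model.parameters()]
    loss = ((xs @ fast[0].t() + fast[1] - ys) ** 2).mean()
    grads = torch.autograd.grad(loss, fast)
    fast = [p - inner_lr * g for p, g in zip(fast, grads)]

    # Outer loop: query-set gradients update the meta-parameters
    # (first-order approximation: second-order terms are dropped).
    qloss = ((xq @ fast[0].t() + fast[1] - yq) ** 2).mean()
    qgrads = torch.autograd.grad(qloss, fast)
    meta_opt.zero_grad()
    for p, g in zip(model.parameters(), qgrads):
        p.grad = g.detach()
    meta_opt.step()
```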
Next, let us dive into the detailed implementation of DNAO.
Detailed Implementation of DNAO
Step 1: Initial Training
Begin by establishing a solid foundation through initial training of a base model. Perform multiple trials utilizing assorted tasks to foster comprehension regarding varying neural network architectures' efficacies across distinct domains. Collected data shall then inform the ensuing meta-learning processes.
Step 2: Data Collection and Preprocessing
Assemble ample datasets addressing disparate tasks such as image recognition, natural language processing, speech recognition, and time series analysis. Following acquisition, conduct necessary preparatory measures – namely, normalization, augmentation, and partitioning into designated subsets (training, validation, testing). Leverage proven tools like NumPy, Pandas, and Scikit-learn for seamless execution.
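A minimal sketch of this preparatory stage with scikit-learn, assuming hypothetical feature/label arrays in place of a real task's data:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Placeholder data standing in for an assembled dataset.
X = np.random.rand(1000, 32).astype(np.float32)
y = np.random.randint(0, 10, size=1000)

# Partition into designated subsets: 70% train, 15% validation, 15% test.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.3, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# Normalize with statistics fitted on the training split only.
scaler = StandardScaler().fit(X_train)
X_train, X_val, X_test = map(scaler.transform, (X_train, X_val, X_test))
```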
Step 3: Neural Network Architectures
Select suitable architectures corresponding to respective tasks. For instance, consider employing Convolutional Neural Networks (CNNs) for image recognition (e.g., VGG, ResNet) or Recurrent Neural Networks (RNNs) for time series analysis (e.g., LSTM, GRU). To facilitate development, capitalize on robust deep learning libraries like TensorFlow, PyTorch, or Keras, offering abundant prefabricated components for effortless creation and instruction.
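For instance, a compact CNN for image inputs and an LSTM classifier for sequences might look like this PyTorch sketch (layer sizes and class counts are placeholders):

```python
import torch.nn as nn

# Small CNN for image recognition (e.g. 3-channel inputs, 10 classes).
cnn = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),
)

# LSTM classifier for time-series / sequence inputs.
class SeqClassifier(nn.Module):
    def __init__(self, n_features=8, hidden=64, n_classes=10):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # classify from the last time step
```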
Step 4: Training Loop Setup
Establish an organized training procedure incorporating essential elements such as data loading, model initialization, optimization algorithm selection, and assessment conducted via specified metrics (accuracy, loss, AUC). Make use of readily accessible interfaces provided by reputable libraries such as TensorFlow, PyTorch, or Keras.
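A generic version of such a loop, sketched in PyTorch with synthetic data and accuracy as the tracked metric (the model and hyperparameters are illustrative):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic classification data standing in for a real task.
X = torch.randn(512, 32)
y = torch.randint(0, 10, (512,))
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    correct = total = 0
    for xb, yb in loader:
        opt.zero_grad()
        logits = model(xb)
        loss = loss_fn(logits, yb)
        loss.backward()
        opt.step()
        correct += (logits.argmax(dim=1) == yb).sum().item()
        total += yb.numel()
    print(f"epoch {epoch}: loss={loss.item():.3f} acc={correct / total:.3f}")
```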
Step 5: Model Storage
Preserve trained models in universally compatible formats (HDF5, JSON) for subsequent ease of accessibility throughout meta-learning phases. Employ proficient modules including h5py library and json package for secure stowage.
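One possible realization, with weights persisted to HDF5 via h5py and architecture metadata to JSON (file names, keys, and the metadata layout are illustrative assumptions):

```python
import json
import h5py
import numpy as np

# Placeholder weights standing in for a trained model's parameters.
weights = {
    "conv1_kernel": np.random.rand(16, 3, 3, 3).astype(np.float32),
    "dense_kernel": np.random.rand(32, 10).astype(np.float32),
}
with h5py.File("model.h5", "w") as f:
    for name, arr in weights.items():
        f.create_dataset(name, data=arr)

with open("model.json", "w") as f:
    json.dump({"task": "image_recognition",
               "layers": ["conv", "pool", "dense"]}, f, indent=2)

# Reload later, e.g. at the start of the meta-learning phase.
with h5py.File("model.h5", "r") as f:
    restored = {name: f[name][...] for name in f.keys()}
```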
Subsequently, transition towards the crucial meta-learning aspect of DNAO.
Meta-Learning Phase
Part 1: Observer Pattern
Track the base model's progression amidst varied undertakings at differing levels of training maturation. Record pertinent indicators (precision, loss, elapsed time, resource allocation) to equip the meta-learner with exhaustive awareness concerning the base model's educational journey and efficiency.
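A minimal observer sketch that records these indicators per task and epoch (the field names and logged values are placeholders):

```python
import time

class TrainingObserver:
    """Accumulates per-task training indicators for the meta-learner."""

    def __init__(self):
        self.records = []

    def log(self, task, epoch, loss, accuracy, started_at):
        self.records.append({
            "task": task,
            "epoch": epoch,
            "loss": float(loss),
            "accuracy": float(accuracy),
            "elapsed_s": time.time() - started_at,
        })

# Usage inside a training loop (values here are placeholders):
obs = TrainingObserver()
t0 = time.time()
obs.log(task="image_recognition", epoch=0, loss=2.1, accuracy=0.31, started_at=t0)
```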
Part 2: Developer Pattern
Construct and actualize the meta-learner by deploying established machine learning or deep learning algorithms. Selectively apply techniques like reinforcement learning, supervised learning, or unsupervised learning contingent upon prevailing data availability and objective expectations.
Part 3: Adaptive Architecture Generation
Capitalize on wisdom gleaned from the meta-learning excursions to engender specialized neural network structures harmonious with particular tasks or databases. Ensure fine-tuned precision alongside commendable operational efficiency, all whilst maintaining dynamic responsiveness toward evolving circumstances.
Substep 3.1: Architecture Exploration
Formulate a versatile strategy generating a spectrum of prospective neural network arrangements predicated upon dissimilar constituents and configuration schemes. Beneficial components comprise convolutional layers, pooling layers, recurrent layers, and others alike. Relish advanced functionalities offered by esteemed libraries like TensorFlow or PyTorch to streamline assembly operations.
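A toy version of such a generator, sampling candidate layer stacks from a small component vocabulary (the operations and widths are illustrative assumptions):

```python
import random

COMPONENTS = ["conv3x3", "conv5x5", "maxpool", "avgpool", "lstm", "dense"]

def sample_architecture(max_depth=6):
    """Sample one candidate as a list of layer specifications."""
    depth = random.randint(2, max_depth)
    return [{"op": random.choice(COMPONENTS),
             "width": random.choice([16, 32, 64, 128])}
            for _ in range(depth)]

candidates = [sample_architecture() for _ in range(10)]
```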
Substep 3.2: Meta-Learner Integration
Interweave the gathered meta-learner expertise into the arrangement generation mechanism, thereby positioning oneself to objectively assess and preferentially advance viable candidates applicable to precise situations or collections. Engage distinguished machine learning models (Random Forest, Support Vector Machines) to carry out discriminating judgments.
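As a sketch of this judgment step, a Random Forest surrogate can be fitted on encodings of previously evaluated architectures and used to rank new candidates; the encodings and scores below are synthetic placeholders for knowledge gathered by the meta-learner:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
seen_encodings = rng.random((50, 4))   # e.g. depth, mean width, op counts
seen_scores = rng.random(50)           # observed validation accuracy

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(seen_encodings, seen_scores)

candidates = rng.random((10, 4))       # encodings of new candidates
ranked = candidates[np.argsort(-surrogate.predict(candidates))]
best = ranked[0]                       # most promising candidate first
```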
Substep 3.3: Architecture Optimization
Refine handpicked layouts via sophisticated techniques involving gradient descent, genetic algorithms (DEAP), or Bayesian optimization. Ultimately, amplify their prowess in terms of both pinpoint accuracy and resource frugality.
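A plain-Python genetic-algorithm sketch standing in for a DEAP-based implementation: it evolves layer-width vectors under a toy fitness function (the objective, operators, and population settings are all illustrative):

```python
import random

def fitness(widths):
    # Toy objective preferring widths near 64; a real system would
    # evaluate validation accuracy and resource cost instead.
    return -sum((w - 64) ** 2 for w in widths)

def mutate(widths, p=0.3):
    return [random.choice([16, 32, 64, 128]) if random.random() < p else w
            for w in widths]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

pop = [[random.choice([16, 32, 64, 128]) for _ in range(4)] for _ in range(20)]
for gen in range(30):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                 # survival of the fittest
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    pop = parents + children

print(max(pop, key=fitness))
```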
Finally, culminate in the successful deployment of the meticulously crafted DNAO solution.
Model Deployment
Embody the perfected neural network structure into a formative AI scheme, competently tackling assigned objectives or database quandaries. Behold the remarkable benefits derived from the diligent endeavor put forth thus far.
To summarize, mastery over DNAO signifies triumphantly melding two powerful paradigms—neural architecture search and meta-learning—to yield a formidable force driving unequaled efficiency and precision within artificial intelligence landscapes. Immerse yourself in the intricate dance between these complementary disciplines and unlock boundless possibilities for innovation.
Should you require any clarification or auxiliary guidance, kindly do not hesitate to ask. Best wishes in your exploratory pursuit!
https://blog.salesforceairesearch.com/large-action-models/
https://arxiv.org/abs/2310.08560
https://machinelearningmastery.com/meta-learning-in-machine-learning/
https://arxiv.org/abs/1703.03400
https://www.turing.com/kb/genetic-algorithm-applications-in-ml
https://arxiv.org/abs/1712.00559
https://www.cuubstudio.com/blog/what-is-adaptive-architecture/
https://arxiv.org/abs/2104.00597
https://arxiv.org/abs/1904.00420
https://github.com/cg123/mergekit/tree/main?tab=readme-ov-file#merge-methods
https://lilianweng.github.io/posts/2019-09-05-evolution-strategies/#:~:text=Evolution%20Strategies%20(ES)%20is%20one,role%20in%20deep%20reinforcement%20learning.
| 222limin/dnao | [
"license:other",
"arxiv:2310.08560",
"arxiv:1703.03400",
"arxiv:1712.00559",
"arxiv:2104.00597",
"arxiv:1904.00420",
"region:us"
] | 2024-01-24T04:46:52+00:00 | {"license": "other", "license_name": "limin", "license_link": "LICENSE"} | 2024-01-24T17:51:02+00:00 | [
"2310.08560",
"1703.03400",
"1712.00559",
"2104.00597",
"1904.00420"
] | [] | TAGS
#license-other #arxiv-2310.08560 #arxiv-1703.03400 #arxiv-1712.00559 #arxiv-2104.00597 #arxiv-1904.00420 #region-us
| Dynamic Neural Architecture Optimization (DNAO) Through Adaptive Meta-Learning: Overview and Key Components
Background
Neural Architecture Search (NAS): NAS refers to the automated discovery of efficient neural network architectures for given tasks without extensive manual intervention (Baker et al., 2016; Zoph & Le, 2018). It enables researchers and practitioners to find high-performing models tailored to specific challenges.
Meta-Learning: Also known as 'learning to learn', meta-learning accelerates the learning process of machine learning models by transferring knowledge between related tasks (Schmidhuber, 1987; Thrun & Pratt, 1998; Schmidhuber, 2013).
Introducing DNAO
Dynamic Neural Architecture Optimization (DNAO) was initially proposed in Xie et al., 2020 and builds on the concepts of NAS and meta-learning. DNAO uses adaptive meta-learning to combine a self-evolving neural network architecture with a meta-learning component, enabling enhanced performance and reduced computational cost. Applications include image recognition, natural language processing, and speech recognition.
Key components
Self-evolving neural network architecture: Three approaches used within DNAO are Evolution Strategies (ES), Genetic Algorithms (GA), and Reinforcement Learning (RL). They allow for online adaptation of the neural network architecture according to changing problem conditions.
Evolution Strategies (ES): ES involves iteratively updating parameters using random mutations and evaluating fitness (Back et al., 1997; Real et al., 2019).
Genetic Algorithms (GA): GA mimics biological evolution through crossover, mutation, and survival-of-the-fittest principles (Goldberg, 1989; Deb et al., 2002).
Reinforcement Learning (RL): RL adjusts actions based on reward signals, gradually learning optimal policies (Sutton & Barto, 1998).
Meta-learning component: Within DNAO, three prominent meta-learning techniques are employed: Model-agnostic Meta-Learning (MAML), Progressive Neural Architecture Search (PNAS), and One Shot Neural Architecture Search (OSNAS). Each technique facilitates rapid adaptation to new tasks while leveraging prior knowledge.
Model-agnostic Meta-Learning (MAML): A meta-learning algorithm designed for few-shot learning, allowing fast parameter updates when faced with new tasks (Finn et al., 2017).
Progressive Neural Architecture Search (PNAS): Gradually grows child models by adding layers to parent models, retaining structural similarity among generations (Chen et al., 2018).
One Shot Neural Architecture Search (OSNAS): Predicts entire neural architectures using one single sample, drastically reducing computation (Brock et al., 2017).
Next, let us dive into the detailed implementation of DNAO.
Detailed Implementation of DNAO
Step 1: Initial Training
Begin by establishing a solid foundation through initial training of a base model. Perform multiple trials utilizing assorted tasks to foster comprehension regarding varying neural network architectures' efficacies across distinct domains. Collected data shall then inform the ensuing meta-learning processes.
Step 2: Data Collection and Preprocessing
Assemble ample datasets addressing disparate tasks such as image recognition, natural language processing, speech recognition, and time series analysis. Following acquisition, conduct necessary preparatory measures – namely, normalization, augmentation, and partitioning into designated subsets (training, validation, testing). Leverage proven tools like NumPy, Pandas, and Scikit-learn for seamless execution.
Step 3: Neural Network Architectures
Select suitable architectures corresponding to respective tasks. For instance, consider employing Convolutional Neural Networks (CNNs) for image recognition (e.g., VGG, ResNet) or Recurrent Neural Networks (RNNs) for time series analysis (e.g., LSTM, GRU). To facilitate development, capitalize on robust deep learning libraries like TensorFlow, PyTorch, or Keras, offering abundant prefabricated components for effortless creation and instruction.
Step 4: Training Loop Setup
Establish an organized training procedure incorporating essential elements such as data loading, model initialization, optimization algorithm selection, and assessment conducted via specified metrics (accuracy, loss, AUC). Make use of readily accessible interfaces provided by reputable libraries such as TensorFlow, PyTorch, or Keras.
Step 5: Model Storage
Preserve trained models in universally compatible formats (HDF5, JSON) for subsequent ease of accessibility throughout meta-learning phases. Employ proficient modules including h5py library and json package for secure stowage.
Subsequently, transition towards the crucial meta-learning aspect of DNAO.
Meta-Learning Phase
Part 1: Observer Pattern
Track the base model's progression amidst varied undertakings at differing levels of training maturation. Record pertinent indicators (precision, loss, elapsed time, resource allocation) to equip the meta-learner with exhaustive awareness concerning the base model's educational journey and efficiency.
Part 2: Developer Pattern
Construct and actualize the meta-learner by deploying established machine learning or deep learning algorithms. Selectively apply techniques like reinforcement learning, supervised learning, or unsupervised learning contingent upon prevailing data availability and objective expectations.
Part 3: Adaptive Architecture Generation
Capitalize on wisdom gleaned from the meta-learning excursions to engender specialized neural network structures harmonious with particular tasks or databases. Ensure fine-tuned precision alongside commendable operational efficiency, all whilst maintaining dynamic responsiveness toward evolving circumstances.
Substep 3.1: Architecture Exploration
Formulate a versatile strategy generating a spectrum of prospective neural network arrangements predicated upon dissimilar constituents and configuration schemes. Beneficial components comprise convolutional layers, pooling layers, recurrent layers, and others alike. Relish advanced functionalities offered by esteemed libraries like TensorFlow or PyTorch to streamline assembly operations.
Substep 3.2: Meta-Learner Integration
Interweave the gathered meta-learner expertise into the arrangement generation mechanism, thereby positioning oneself to objectively assess and preferentially advance viable candidates applicable to precise situations or collections. Engage distinguished machine learning models (Random Forest, Support Vector Machines) to carry out discriminating judgments.
Substep 3.3: Architecture Optimization
Refine handpicked layouts via sophisticated techniques involving gradient descent, genetic algorithms (DEAP), or Bayesian optimization. Ultimately, amplify their prowess in terms of both pinpoint accuracy and resource frugality.
Finally, culminate in the successful deployment of the meticulously crafted DNAO solution.
Model Deployment
Embody the perfected neural network structure into a formative AI scheme, competently tackling assigned objectives or database quandaries. Behold the remarkable benefits derived from the diligent endeavor put forth thus far.
To summarize, mastery over DNAO signifies triumphantly melding two powerful paradigms—neural architecture search and meta-learning—to yield a formidable force driving unequaled efficiency and precision within artificial intelligence landscapes. Immerse yourself in the intricate dance between these complementary disciplines and unlock boundless possibilities for innovation.
Should you require any clarification or auxiliary guidance, kindly do not hesitate to ask. Best wishes in your exploratory pursuit!
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
| [] | [
"TAGS\n#license-other #arxiv-2310.08560 #arxiv-1703.03400 #arxiv-1712.00559 #arxiv-2104.00597 #arxiv-1904.00420 #region-us \n"
] |
b83ddb1dd7c4f17fcf1bbbfb6f02a2dce389c0b0 |
# Dataset of red_hood/レッドフード/小红帽/레드 후드 (Nikke: Goddess of Victory)
This is the dataset of red_hood/レッドフード/小红帽/레드 후드 (Nikke: Goddess of Victory), containing 139 images and their tags.
The core tags of this character are `long_hair, red_hair, breasts, large_breasts, hair_between_eyes, bangs, yellow_eyes, horns, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 139 | 319.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/red_hood_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 139 | 136.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/red_hood_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 374 | 307.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/red_hood_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 139 | 257.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/red_hood_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 374 | 500.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/red_hood_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/red_hood_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, cleavage, black_gloves, looking_at_viewer, navel, pants, smile, fingerless_gloves, holding_gun, rifle, belt, midriff, red_jacket, red_scarf, open_jacket, zipper |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | black_gloves | looking_at_viewer | navel | pants | smile | fingerless_gloves | holding_gun | rifle | belt | midriff | red_jacket | red_scarf | open_jacket | zipper |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:---------------|:--------------------|:--------|:--------|:--------|:--------------------|:--------------|:--------|:-------|:----------|:-------------|:------------|:--------------|:---------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/red_hood_nikke | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-24T05:26:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-24T06:06:13+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of red\_hood/レッドフード/小红帽/레드 후드 (Nikke: Goddess of Victory)
=================================================================
This is the dataset of red\_hood/レッドフード/小红帽/레드 후드 (Nikke: Goddess of Victory), containing 139 images and their tags.
The core tags of this character are 'long\_hair, red\_hair, breasts, large\_breasts, hair\_between\_eyes, bangs, yellow\_eyes, horns, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code:
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
672363b6e6cb8427683d3ecbaef92901a7d53297 |
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-24T05:25:23.992452](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5/blob/main/results_2024-01-24T05-25-23.992452.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23183324101327804,
"acc_stderr": 0.029931781472101626,
"acc_norm": 0.23247087577232647,
"acc_norm_stderr": 0.030724244636163932,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.4671579728496129,
"mc2_stderr": 0.015912807716045203
},
"harness|arc:challenge|25": {
"acc": 0.19965870307167236,
"acc_stderr": 0.011681625756888676,
"acc_norm": 0.24573378839590443,
"acc_norm_stderr": 0.012581033453730107
},
"harness|hellaswag|10": {
"acc": 0.2687711611232822,
"acc_stderr": 0.004424146562746121,
"acc_norm": 0.2749452300338578,
"acc_norm_stderr": 0.004455741817861901
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310053,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310053
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899095,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928724,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928724
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.027719315709614802,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.027719315709614802
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.15656565656565657,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.15656565656565657,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17218543046357615,
"acc_stderr": 0.030826136961962396,
"acc_norm": 0.17218543046357615,
"acc_norm_stderr": 0.030826136961962396
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1871559633027523,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.1871559633027523,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1388888888888889,
"acc_stderr": 0.02358544736890014,
"acc_norm": 0.1388888888888889,
"acc_norm_stderr": 0.02358544736890014
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173106,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173106
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.01516202415227844,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.01516202415227844
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.02226819625878323,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.02226819625878323
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02525786135943242,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02525786135943242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02388688192244034,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02388688192244034
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457921,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457921
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.4671579728496129,
"mc2_stderr": 0.015912807716045203
},
"harness|winogrande|5": {
"acc": 0.47829518547750594,
"acc_stderr": 0.01403923921648463
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5 | [
"region:us"
] | 2024-01-24T05:27:40+00:00 | {"pretty_name": "Evaluation run of Locutusque/TinyMistral-248M-v2.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T05:25:23.992452](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5/blob/main/results_2024-01-24T05-25-23.992452.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23183324101327804,\n \"acc_stderr\": 0.029931781472101626,\n \"acc_norm\": 0.23247087577232647,\n \"acc_norm_stderr\": 0.030724244636163932,\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156496,\n \"mc2\": 0.4671579728496129,\n \"mc2_stderr\": 0.015912807716045203\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19965870307167236,\n \"acc_stderr\": 0.011681625756888676,\n \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730107\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2687711611232822,\n \"acc_stderr\": 0.004424146562746121,\n \"acc_norm\": 0.2749452300338578,\n \"acc_norm_stderr\": 0.004455741817861901\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310053,\n \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310053\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899095,\n \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899095\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080339\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.039325376803928724,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.039325376803928724\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n \"acc_stderr\": 0.022037217340267833,\n \"acc_norm\": 0.18387096774193548,\n \"acc_norm_stderr\": 0.022037217340267833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.027719315709614802,\n \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.027719315709614802\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.15656565656565657,\n \"acc_stderr\": 0.025890520358141454,\n \"acc_norm\": 0.15656565656565657,\n \"acc_norm_stderr\": 0.025890520358141454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n 
\"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246797,\n \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246797\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.17218543046357615,\n \"acc_stderr\": 0.030826136961962396,\n \"acc_norm\": 0.17218543046357615,\n \"acc_norm_stderr\": 0.030826136961962396\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1871559633027523,\n \"acc_stderr\": 0.016722684526200154,\n \"acc_norm\": 0.1871559633027523,\n \"acc_norm_stderr\": 0.016722684526200154\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1388888888888889,\n \"acc_stderr\": 0.02358544736890014,\n \"acc_norm\": 0.1388888888888889,\n \"acc_norm_stderr\": 0.02358544736890014\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.030236389942173106,\n \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.030236389942173106\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.23499361430395913,\n \"acc_stderr\": 0.01516202415227844,\n \"acc_norm\": 0.23499361430395913,\n \"acc_norm_stderr\": 0.01516202415227844\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n \"acc_stderr\": 0.02226819625878323,\n \"acc_norm\": 0.18971061093247588,\n \"acc_norm_stderr\": 0.02226819625878323\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451156,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451156\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02525786135943242,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02525786135943242\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244034,\n \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244034\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457921,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457921\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156496,\n \"mc2\": 0.4671579728496129,\n \"mc2_stderr\": 0.015912807716045203\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.47829518547750594,\n \"acc_stderr\": 0.01403923921648463\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/Locutusque/TinyMistral-248M-v2.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|arc:challenge|25_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|gsm8k|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hellaswag|10_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T05_25_23.992452", "path": ["**/details_harness|winogrande|5_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-24T05-25-23.992452.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_24T05_25_23.992452", "path": ["results_2024-01-24T05-25-23.992452.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T05-25-23.992452.parquet"]}]}]} | 2024-01-24T05:28:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2.5
Dataset automatically created during the evaluation run of model Locutusque/TinyMistral-248M-v2.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
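For example (a minimal sketch, assuming the repository follows the `details_<org>__<model>` naming convention used by the other evaluation datasets on the leaderboard):

```python
from datasets import load_dataset

# Load the per-sample details for the WinoGrande 5-shot task;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_Locutusque__TinyMistral-248M-v2.5",
	"harness_winogrande_5",
	split="train")
```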
## Latest results
These are the latest results from run 2024-01-24T05:25:23.992452 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2.5\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/TinyMistral-248M-v2.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T05:25:23.992452(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Locutusque/TinyMistral-248M-v2.5\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/TinyMistral-248M-v2.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T05:25:23.992452(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1083e2bc153492bce2dc2c867ebdb5872374c57c |
# Dataset Card for Evaluation run of BarryFutureman/ChatMarc-YesAnotherMerge-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/ChatMarc-YesAnotherMerge-7B](https://huggingface.co/BarryFutureman/ChatMarc-YesAnotherMerge-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
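
# Load the per-sample WinoGrande (5-shot) details;
# the "train" split always points to the latest results.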
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B",
"harness_winogrande_5",
split="train")
```
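
The aggregated metrics are available through the "results" configuration in the same way (a minimal sketch; the configuration and split names are the ones listed in this card's metadata):

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the
# most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B",
	"results",
	split="latest")
```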
## Latest results
These are the [latest results from run 2024-01-24T05:34:18.479696](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B/blob/main/results_2024-01-24T05-34-18.479696.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6560146863596611,
"acc_stderr": 0.03206476346441959,
"acc_norm": 0.6553348227714214,
"acc_norm_stderr": 0.03273507303552595,
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404406,
"mc2": 0.700398174715358,
"mc2_stderr": 0.015160702701664436
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244484,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7228639713204541,
"acc_stderr": 0.004466695023677836,
"acc_norm": 0.8838876717785302,
"acc_norm_stderr": 0.003197048476003638
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.016574027219517635,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.016574027219517635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404406,
"mc2": 0.700398174715358,
"mc2_stderr": 0.015160702701664436
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785722
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.012625423152283034
}
}
```
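The same aggregated metrics are also stored in the "results" configuration of this dataset; as a minimal sketch, they can be reloaded directly (the config and split names below are taken from this repository's configuration):

```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B",
	"results",
	split="latest")
```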
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B | [
"region:us"
] | 2024-01-24T05:36:35+00:00 | {"pretty_name": "Evaluation run of BarryFutureman/ChatMarc-YesAnotherMerge-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarryFutureman/ChatMarc-YesAnotherMerge-7B](https://huggingface.co/BarryFutureman/ChatMarc-YesAnotherMerge-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-24T05:34:18.479696](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B/blob/main/results_2024-01-24T05-34-18.479696.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6560146863596611,\n \"acc_stderr\": 0.03206476346441959,\n \"acc_norm\": 0.6553348227714214,\n \"acc_norm_stderr\": 0.03273507303552595,\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.017369236164404406,\n \"mc2\": 0.700398174715358,\n \"mc2_stderr\": 0.015160702701664436\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244484,\n \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7228639713204541,\n \"acc_stderr\": 0.004466695023677836,\n \"acc_norm\": 0.8838876717785302,\n \"acc_norm_stderr\": 0.003197048476003638\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 
0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n 
\"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n \"mc1_stderr\": 0.017369236164404406,\n \"mc2\": 0.700398174715358,\n \"mc2_stderr\": 0.015160702701664436\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6997725549658832,\n \"acc_stderr\": 0.012625423152283034\n }\n}\n```", "repo_url": "https://huggingface.co/BarryFutureman/ChatMarc-YesAnotherMerge-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|arc:challenge|25_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|gsm8k|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hellaswag|10_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["**/details_harness|winogrande|5_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-24T05-34-18.479696.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_24T05_34_18.479696", "path": ["results_2024-01-24T05-34-18.479696.parquet"]}, {"split": "latest", "path": ["results_2024-01-24T05-34-18.479696.parquet"]}]}]} | 2024-01-24T05:36:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarryFutureman/ChatMarc-YesAnotherMerge-7B
Dataset automatically created during the evaluation run of model BarryFutureman/ChatMarc-YesAnotherMerge-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
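```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B",
	"harness_winogrande_5",
	split="train")
```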
## Latest results
These are the latest results from run 2024-01-24T05:34:18.479696 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarryFutureman/ChatMarc-YesAnotherMerge-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/ChatMarc-YesAnotherMerge-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T05:34:18.479696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarryFutureman/ChatMarc-YesAnotherMerge-7B\n\n\n\nDataset automatically created during the evaluation run of model BarryFutureman/ChatMarc-YesAnotherMerge-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-24T05:34:18.479696(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1c4a09e0ad78408f270c2acfbb7c6e4a2a1d476a |
# InternVid
## Dataset Description
- **Homepage:** [InternVid](https://github.com/OpenGVLab/InternVideo/tree/main/Data/InternVid)
- **Repository:** [OpenGVLab](https://github.com/OpenGVLab/InternVideo/tree/main/Data/InternVid)
- **Paper:** [2307.06942](https://arxiv.org/pdf/2307.06942.pdf)
- **Point of Contact:** [InternVideo](mailto:[email protected])
## InternVid-10M-FLT
We present InternVid-10M-FLT, a subset of this dataset, consisting of 10 million video clips, with generated high-quality captions for publicly available web videos.
## Download
The 10M samples are provided as a JSON Lines file. Columns include the video ID, timestamps, the generated caption, and its UMT similarity score.
## How to Use
```
from datasets import load_dataset
dataset = load_dataset("OpenGVLab/InternVid")
```
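
Once loaded, each record carries the columns described above. A minimal inspection sketch, assuming the `InternVid-10M-FLT` config with its `FLT` split as declared in this card's metadata (the dataset is gated, so authenticate first, e.g. with `huggingface-cli login`; the field names are illustrative, so check `column_names` for the real schema):

```python
from datasets import load_dataset

dataset = load_dataset("OpenGVLab/InternVid", "InternVid-10M-FLT")
print(dataset["FLT"].column_names)  # video ID, timestamps, caption, UMT score (assumed names)
print(dataset["FLT"][0])            # one clip-caption record
```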
## Method

## Citation
If you find this work useful for your research, please consider citing InternVid. Your acknowledgement would greatly help us in continuing to contribute resources to the research community.
```
@article{wang2023internvid,
  title={InternVid: A Large-scale Video-Text Dataset for Multimodal Understanding and Generation},
  author={Wang, Yi and He, Yinan and Li, Yizhuo and Li, Kunchang and Yu, Jiashuo and Ma, Xin and Chen, Xinyuan and Wang, Yaohui and Luo, Ping and Liu, Ziwei and Wang, Yali and Wang, Limin and Qiao, Yu},
  journal={arXiv preprint arXiv:2307.06942},
  year={2023}
}

@article{wang2022internvideo,
  title={InternVideo: General Video Foundation Models via Generative and Discriminative Learning},
  author={Wang, Yi and Li, Kunchang and Li, Yizhuo and He, Yinan and Huang, Bingkun and Zhao, Zhiyu and Zhang, Hongjie and Xu, Jilan and Liu, Yi and Wang, Zun and Xing, Sen and Chen, Guo and Pan, Junting and Yu, Jiashuo and Wang, Yali and Wang, Limin and Qiao, Yu},
  journal={arXiv preprint arXiv:2212.03191},
  year={2022}
}
```
| OpenGVLab/InternVid-10M-FLT-INFO | [
"task_categories:feature-extraction",
"size_categories:10M<n<100M",
"language:en",
"license:cc-by-nc-sa-4.0",
"arxiv:2307.06942",
"region:us"
] | 2024-01-24T05:57:45+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["10M<n<100M"], "task_categories": ["feature-extraction"], "extra_gated_prompt": "You agree to not use the data to conduct experiments that cause harm to human subjects.", "extra_gated_fields": {"Name": "text", "Company/Organization": "text", "E-Mail": "text"}, "configs": [{"config_name": "InternVid-10M-FLT", "data_files": [{"split": "FLT", "path": "InternVid-10M-FLT-INFO.jsonl"}]}]} | 2024-01-24T06:02:00+00:00 | [
"2307.06942"
] | [
"en"
] | TAGS
#task_categories-feature-extraction #size_categories-10M<n<100M #language-English #license-cc-by-nc-sa-4.0 #arxiv-2307.06942 #region-us
|
# InternVid
## Dataset Description
- Homepage: InternVid
- Repository: OpenGVLab
- Paper: 2307.06942
- Point of Contact: mailto:InternVideo
## InternVid-10M-FLT
We present InternVid-10M-FLT, a subset of this dataset, consisting of 10 million video clips, with generated high-quality captions for publicly available web videos.
## Download
The 10M samples are provided as a JSON Lines file. Columns include the video ID, timestamps, the generated caption, and its UMT similarity score.
## How to Use
## Method
!Caption Method
If you find this work useful for your research, please consider citing InternVid. Your acknowledgement would greatly help us in continuing to contribute resources to the research community.
| [
"# InternVid",
"## Dataset Description\n\n- Homepage: InternVid\n- Repository: OpenGVLab\n- Paper: 2307.06942\n- Point of Contact: mailto:InternVideo",
"## InternVid-10M-FLT\n\nWe present InternVid-10M-FLT, a subset of this dataset, consisting of 10 million video clips, with generated high-quality captions for publicly available web videos.",
"## Download\n\nThe 10M samples are provided in jsonlines file. Columns include the videoID, timestamps, generated caption and their UMT similarity scores.\\",
"## How to Use",
"## Method\n\n!Caption Method\n\nIf you find this work useful for your research, please consider citing InternVid. Your acknowledgement would greatly help us in continuing to contribute resources to the research community."
] | [
"TAGS\n#task_categories-feature-extraction #size_categories-10M<n<100M #language-English #license-cc-by-nc-sa-4.0 #arxiv-2307.06942 #region-us \n",
"# InternVid",
"## Dataset Description\n\n- Homepage: InternVid\n- Repository: OpenGVLab\n- Paper: 2307.06942\n- Point of Contact: mailto:InternVideo",
"## InternVid-10M-FLT\n\nWe present InternVid-10M-FLT, a subset of this dataset, consisting of 10 million video clips, with generated high-quality captions for publicly available web videos.",
"## Download\n\nThe 10M samples are provided in jsonlines file. Columns include the videoID, timestamps, generated caption and their UMT similarity scores.\\",
"## How to Use",
"## Method\n\n!Caption Method\n\nIf you find this work useful for your research, please consider citing InternVid. Your acknowledgement would greatly help us in continuing to contribute resources to the research community."
] |
bd9f265f2fe41cafd3ab34a636e0687ca1503dc2 |
# Dataset of naga/ナガ/娜嘉/나가 (Nikke: Goddess of Victory)
This is the dataset of naga/ナガ/娜嘉/나가 (Nikke: Goddess of Victory), containing 82 images and their tags.
The core tags of this character are `breasts, bangs, large_breasts, hair_over_one_eye, long_hair, dark_skin, hair_ornament, brown_hair, dark-skinned_female, wrist_scrunchie, scrunchie, yellow_eyes, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 82 | 139.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 82 | 67.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 201 | 145.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 82 | 117.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 201 | 226.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/naga_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/naga_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
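
If you only want the plain image-plus-tag packages rather than the waifuc pipeline, the zipped `IMG+TXT` variants from the table above can be read directly. A hedged sketch (it assumes each image in the archive is paired with a same-named `.txt` tag file; verify the layout after extraction):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/naga_nikke',
    repo_type='dataset',
    filename='dataset-800.zip',
)
out_dir = 'naga_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# print the tag line assumed to accompany each image
for name in sorted(os.listdir(out_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(out_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```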
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blue_skirt, collared_shirt, looking_at_viewer, school_uniform, short_sleeves, simple_background, solo, white_background, white_shirt, navel, pleated_skirt, blush, jewelry, midriff, black_choker, crop_top, ear_piercing, thighs, breast_pocket, purple_scrunchie, thigh_strap, black_nails, lifted_by_self, skirt_lift, mouth_hold, parted_lips, smile, striped, cowboy_shot, hair_ribbon, miniskirt, purple_necktie, stomach |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_skirt | collared_shirt | looking_at_viewer | school_uniform | short_sleeves | simple_background | solo | white_background | white_shirt | navel | pleated_skirt | blush | jewelry | midriff | black_choker | crop_top | ear_piercing | thighs | breast_pocket | purple_scrunchie | thigh_strap | black_nails | lifted_by_self | skirt_lift | mouth_hold | parted_lips | smile | striped | cowboy_shot | hair_ribbon | miniskirt | purple_necktie | stomach |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-----------------|:--------------------|:-----------------|:----------------|:--------------------|:-------|:-------------------|:--------------|:--------|:----------------|:--------|:----------|:----------|:---------------|:-----------|:---------------|:---------|:----------------|:-------------------|:--------------|:--------------|:-----------------|:-------------|:-------------|:--------------|:--------|:----------|:--------------|:--------------|:------------|:-----------------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/naga_nikke | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-24T06:32:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-24T06:53:14+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of naga/ナガ/娜嘉/나가 (Nikke: Goddess of Victory)
====================================================
This is the dataset of naga/ナガ/娜嘉/나가 (Nikke: Goddess of Victory), containing 82 images and their tags.
The core tags of this character are 'breasts, bangs, large\_breasts, hair\_over\_one\_eye, long\_hair, dark\_skin, hair\_ornament, brown\_hair, dark-skinned\_female, wrist\_scrunchie, scrunchie, yellow\_eyes, brown\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
0921925cc8cec87ed83689031073882eaf57d55e |
# Dataset of tia/ティア/蒂亚/티아 (Nikke: Goddess of Victory)
This is the dataset of tia/ティア/蒂亚/티아 (Nikke: Goddess of Victory), containing 67 images and their tags.
The core tags of this character are `breasts, long_hair, blonde_hair, bangs, hair_ornament, large_breasts, bow, brown_eyes, mole, mole_under_eye, hairclip, huge_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 67 | 131.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tia_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 67 | 62.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tia_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 167 | 135.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tia_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 67 | 112.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tia_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 167 | 221.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tia_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/tia_nikke',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, solo, white_thighhighs, cleavage, smile, white_shirt, thighs, blush, blue_jacket, bowtie, collared_shirt, ring, blue_skirt, button_gap, sitting, white_background, piercing, simple_background |
| 1 | 12 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, white_shirt, collared_shirt, cleavage, long_sleeves, smile, simple_background, white_background, ring, blue_bowtie, ear_piercing, open_mouth, red_eyes, black_bra, blue_jacket, heart, nail_polish, open_clothes, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | solo | white_thighhighs | cleavage | smile | white_shirt | thighs | blush | blue_jacket | bowtie | collared_shirt | ring | blue_skirt | button_gap | sitting | white_background | piercing | simple_background | blue_bowtie | ear_piercing | open_mouth | red_eyes | black_bra | heart | nail_polish | open_clothes | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------|:-------------------|:-----------|:--------|:--------------|:---------|:--------|:--------------|:---------|:-----------------|:-------|:-------------|:-------------|:----------|:-------------------|:-----------|:--------------------|:--------------|:---------------|:-------------|:-----------|:------------|:--------|:--------------|:---------------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | | X | X | X | | X | X | | X | X | | | | X | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/tia_nikke | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-24T06:47:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-24T07:04:06+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of tia/ティア/蒂亚/티아 (Nikke: Goddess of Victory)
====================================================
This is the dataset of tia/ティア/蒂亚/티아 (Nikke: Goddess of Victory), containing 67 images and their tags.
The core tags of this character are 'breasts, long\_hair, blonde\_hair, bangs, hair\_ornament, large\_breasts, bow, brown\_eyes, mole, mole\_under\_eye, hairclip, huge\_breasts, earrings', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2e85241120aefd3dfa825ca4b41d2c84742265f9 |
source:
- https://huggingface.co/datasets/FreedomIntelligence/HuatuoGPT-sft-data-v1
- https://huggingface.co/datasets/FreedomIntelligence/HuatuoGPT2_sft_instruct_GPT4_50K
Converted to the ShareGPT format as JSONL files.
data size:
```
> wc -l HuatuoGPT_sft_data_v1_sharegpt.jsonl
226042 HuatuoGPT_sft_data_v1_sharegpt.jsonl
> wc -l HuatuoGPT2_sft_instruct_GPT4_sharegpt.jsonl
50000 HuatuoGPT2_sft_instruct_GPT4_sharegpt.jsonl
```
Conversion script: convert.py
```python
import json

# input JSONL file and ShareGPT-format output file
input_file = './HuatuoGPT2_sft_instruct_GPT4.jsonl'
output_file = './HuatuoGPT2_sft_instruct_GPT4_sharegpt.jsonl'

with open(input_file, 'r', encoding='utf-8') as infile, open(output_file, 'w', encoding='utf-8') as outfile:
    # read the JSONL file line by line
    for idx, line in enumerate(infile):
        output_json = {"conversations": []}
        # parse the JSON object
        data = json.loads(line.strip())
        # each JSON object is assumed to have a "data" list of alternating questions and answers
        for i, item in enumerate(data['data']):
            if i % 2 == 0:  # questions assumed at even indices, answers at odd indices
                output_json['conversations'].append({
                    "from": "human",
                    "value": item[2:]  # strip the two-character speaker prefix
                })
            else:
                output_json['conversations'].append({
                    "from": "gpt",
                    "value": item[2:]
                })
        # write the converted record as one line of the output file
        outfile.write(json.dumps(output_json, ensure_ascii=False) + '\n')

print(f"Conversion complete. Output saved to '{output_file}'.")
``` | shibing624/huatuo_medical_qa_sharegpt | [
"license:apache-2.0",
"region:us"
] | 2024-01-24T06:49:33+00:00 | {"license": "apache-2.0"} | 2024-01-29T04:03:31+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
source:
- URL
- URL
Converted to the ShareGPT format as JSONL files.
data size:
Conversion script: URL
| [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
03ef5283840f0a54ff15224e5c92b8ac60304158 |
Dataset full description: https://www.kaggle.com/datasets/lockiultra/yandex-geo-reviews-embeddings
The dataset contains an index column, 768 embedding columns, and a rating column. Each row is the embedding representation of the review text with the same index.
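
As a sketch of downstream use, the stored vectors can feed a simple rating predictor directly. The snippet below is illustrative only: it assumes the default `train` split and that the columns run index, then the 768 embedding dimensions, then the rating; adjust the slicing to the actual schema.

```python
import numpy as np
from datasets import load_dataset

# load the table of precomputed review embeddings
ds = load_dataset("lockiultra/yandex-geo-reviews-embeddings", split="train")
df = ds.to_pandas()

# hypothetical column layout: index | 768 embedding dims | rating
X = df.iloc[:, 1:-1].to_numpy(dtype=np.float32)  # embeddings
y = df.iloc[:, -1].to_numpy()                    # ratings

print(X.shape, y.shape)  # expect (n_rows, 768) and (n_rows,)
```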
| lockiultra/yandex-geo-reviews-embeddings | [
"task_categories:feature-extraction",
"size_categories:100K<n<1M",
"language:ru",
"license:mit",
"reviews",
"region:us"
] | 2024-01-24T06:52:04+00:00 | {"language": ["ru"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["feature-extraction"], "tags": ["reviews"]} | 2024-01-25T01:54:15+00:00 | [] | [
"ru"
] | TAGS
#task_categories-feature-extraction #size_categories-100K<n<1M #language-Russian #license-mit #reviews #region-us
|
Dataset full description: URL
The dataset contains an index column, 768 embedding columns, and a rating column. Each row is the embedding representation of the review text with the same index.
| [] | [
"TAGS\n#task_categories-feature-extraction #size_categories-100K<n<1M #language-Russian #license-mit #reviews #region-us \n"
] |
3bc8eeaf370cc7160c3d9d05c4b4df5e1be96230 | # Dataset Card for "JFLD_punipuni_monster"
See [here](https://github.com/hitachi-nlp/FLD-corpus.git) for the details of this corpus.
For the whole of the project, see [our project page](https://github.com/hitachi-nlp/FLD/).
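
A minimal loading sketch; the config names (`D1`, `D1_minus`, `D3`, `D8`), the splits, and the fields used below all come from this card's metadata:

```python
from datasets import load_dataset

jfld = load_dataset("hitachi-nlp/JFLD_punipuni_monster", "D1")
sample = jfld["train"][0]
print(sample["hypothesis"])   # natural-language hypothesis
print(sample["proof_label"])  # gold proof label
```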
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hitachi-nlp/JFLD_punipuni_monster | [
"region:us"
] | 2024-01-24T06:53:36+00:00 | {"dataset_info": [{"config_name": "D1", "features": [{"name": "version", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "hypothesis_formula", "dtype": "string"}, {"name": "facts", "dtype": "string"}, {"name": "facts_formula", "dtype": "string"}, {"name": "proofs", "sequence": "string"}, {"name": "proofs_formula", "sequence": "string"}, {"name": "negative_hypothesis", "dtype": "string"}, {"name": "negative_hypothesis_formula", "dtype": "string"}, {"name": "negative_proofs", "sequence": "string"}, {"name": "negative_original_tree_depth", "dtype": "int64"}, {"name": "original_tree_depth", "dtype": "int64"}, {"name": "depth", "dtype": "int64"}, {"name": "num_formula_distractors", "dtype": "int64"}, {"name": "num_translation_distractors", "dtype": "int64"}, {"name": "num_all_distractors", "dtype": "int64"}, {"name": "proof_label", "dtype": "string"}, {"name": "negative_proof_label", "dtype": "string"}, {"name": "world_assump_label", "dtype": "string"}, {"name": "negative_world_assump_label", "dtype": "string"}, {"name": "prompt_serial", "dtype": "string"}, {"name": "proof_serial", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 114387542, "num_examples": 30000}, {"name": "validation", "num_bytes": 19240284, "num_examples": 5000}, {"name": "test", "num_bytes": 18756246, "num_examples": 5000}], "download_size": 50441531, "dataset_size": 152384072}, {"config_name": "D1_minus", "features": [{"name": "version", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "hypothesis_formula", "dtype": "string"}, {"name": "facts", "dtype": "string"}, {"name": "facts_formula", "dtype": "string"}, {"name": "proofs", "sequence": "string"}, {"name": "proofs_formula", "sequence": "string"}, {"name": "negative_hypothesis", "dtype": "null"}, {"name": "negative_hypothesis_formula", "dtype": "null"}, {"name": "negative_proofs", "sequence": "null"}, {"name": "negative_original_tree_depth", "dtype": "null"}, {"name": "original_tree_depth", "dtype": "int64"}, {"name": "depth", "dtype": "int64"}, {"name": "num_formula_distractors", "dtype": "int64"}, {"name": "num_translation_distractors", "dtype": "int64"}, {"name": "num_all_distractors", "dtype": "int64"}, {"name": "proof_label", "dtype": "string"}, {"name": "negative_proof_label", "dtype": "null"}, {"name": "world_assump_label", "dtype": "string"}, {"name": "negative_world_assump_label", "dtype": "null"}, {"name": "prompt_serial", "dtype": "string"}, {"name": "proof_serial", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22931353, "num_examples": 30000}, {"name": "validation", "num_bytes": 3834131, "num_examples": 5000}, {"name": "test", "num_bytes": 3803197, "num_examples": 5000}], "download_size": 8882918, "dataset_size": 30568681}, {"config_name": "D3", "features": [{"name": "version", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "hypothesis_formula", "dtype": "string"}, {"name": "facts", "dtype": "string"}, {"name": "facts_formula", "dtype": "string"}, {"name": "proofs", "sequence": "string"}, {"name": "proofs_formula", "sequence": "string"}, {"name": "negative_hypothesis", "dtype": "string"}, {"name": "negative_hypothesis_formula", "dtype": "string"}, {"name": "negative_proofs", "sequence": "string"}, {"name": "negative_original_tree_depth", "dtype": "int64"}, {"name": "original_tree_depth", "dtype": "int64"}, {"name": "depth", "dtype": "int64"}, {"name": "num_formula_distractors", "dtype": "int64"}, {"name": 
"num_translation_distractors", "dtype": "int64"}, {"name": "num_all_distractors", "dtype": "int64"}, {"name": "proof_label", "dtype": "string"}, {"name": "negative_proof_label", "dtype": "string"}, {"name": "world_assump_label", "dtype": "string"}, {"name": "negative_world_assump_label", "dtype": "string"}, {"name": "prompt_serial", "dtype": "string"}, {"name": "proof_serial", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 130918768, "num_examples": 30000}, {"name": "validation", "num_bytes": 21585277, "num_examples": 5000}, {"name": "test", "num_bytes": 21831791, "num_examples": 5000}], "download_size": 57632661, "dataset_size": 174335836}, {"config_name": "D8", "features": [{"name": "version", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "hypothesis_formula", "dtype": "string"}, {"name": "facts", "dtype": "string"}, {"name": "facts_formula", "dtype": "string"}, {"name": "proofs", "sequence": "string"}, {"name": "proofs_formula", "sequence": "string"}, {"name": "negative_hypothesis", "dtype": "string"}, {"name": "negative_hypothesis_formula", "dtype": "string"}, {"name": "negative_proofs", "sequence": "string"}, {"name": "negative_original_tree_depth", "dtype": "int64"}, {"name": "original_tree_depth", "dtype": "int64"}, {"name": "depth", "dtype": "int64"}, {"name": "num_formula_distractors", "dtype": "int64"}, {"name": "num_translation_distractors", "dtype": "int64"}, {"name": "num_all_distractors", "dtype": "int64"}, {"name": "proof_label", "dtype": "string"}, {"name": "negative_proof_label", "dtype": "string"}, {"name": "world_assump_label", "dtype": "string"}, {"name": "negative_world_assump_label", "dtype": "string"}, {"name": "prompt_serial", "dtype": "string"}, {"name": "proof_serial", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 166584697, "num_examples": 30000}, {"name": "validation", "num_bytes": 27762073, "num_examples": 5000}, {"name": "test", "num_bytes": 27797559, "num_examples": 5000}], "download_size": 71866505, "dataset_size": 222144329}], "configs": [{"config_name": "D1", "data_files": [{"split": "train", "path": "D1/train-*"}, {"split": "validation", "path": "D1/validation-*"}, {"split": "test", "path": "D1/test-*"}]}, {"config_name": "D1_minus", "data_files": [{"split": "train", "path": "D1_minus/train-*"}, {"split": "validation", "path": "D1_minus/validation-*"}, {"split": "test", "path": "D1_minus/test-*"}]}, {"config_name": "D3", "data_files": [{"split": "train", "path": "D3/train-*"}, {"split": "validation", "path": "D3/validation-*"}, {"split": "test", "path": "D3/test-*"}]}, {"config_name": "D8", "data_files": [{"split": "train", "path": "D8/train-*"}, {"split": "validation", "path": "D8/validation-*"}, {"split": "test", "path": "D8/test-*"}]}]} | 2024-01-31T11:14:48+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "JFLD_punipuni_monster"
See here for the details of this corpus.
For the whole of the project, see our project page.
More Information needed | [
"# Dataset Card for \"JFLD_punipuni_monster\"\n\nSee here for the details of this corpus.\nFor the whole of the project, see our project page.\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"JFLD_punipuni_monster\"\n\nSee here for the details of this corpus.\nFor the whole of the project, see our project page.\n\nMore Information needed"
] |